22  Sensitivity Analysis and Robustness in Dynamics

Status: Draft

v0.4

22.1 Learning Objectives

After reading this chapter, you will be able to:

  • Assess sensitivity to parameters in dynamical systems
  • Assess sensitivity to initial conditions
  • Assess sensitivity to structural perturbations
  • Compute robustness metrics for dynamical systems
  • Understand how interventions affect system robustness
  • Use adjoint methods for efficient gradient computation
  • Perform global sensitivity analysis using Sobol indices
  • Propagate uncertainty through dynamical models
  • Interpret E-values for sensitivity to unmeasured confounding
  • Apply Rosenbaum bounds in matched observational studies
  • Understand Manski partial identification and bounds on causal effects
  • Connect causal sensitivity analysis to dynamical system robustness

22.2 Introduction

This chapter shows how to assess sensitivity and robustness in dynamical systems—how do results change with parameter variations, structural changes, or different initial conditions? This is part of Imagining in the Dynamical world—exploring how system behaviour changes under different assumptions.

22.3 Sensitivity to Parameters

22.3.1 Parameter Sensitivity

Question: How do system dynamics change with parameter variations?

Methods:

  • Local sensitivity: Derivatives \(\frac{\partial X_t}{\partial \theta}\) (how small changes affect dynamics)
  • Global sensitivity: How large parameter changes affect system behaviour
  • Sensitivity indices: Sobol indices, variance-based sensitivity
  • Adjoint sensitivity: Efficient gradient computation for many parameters

22.3.2 Example: SIR Model Sensitivity

In an SIR model, how sensitive is the peak infection rate to:

  • Transmission rate \(\beta\)?
  • Recovery rate \(\gamma\)?
  • Initial conditions?

22.3.3 Implementation: Parameter Sensitivity Analysis

We can assess parameter sensitivity by varying parameters and observing how outcomes change:

# Find project root and include ensure_packages.jl
project_root = let
    current = pwd()
    while !isfile(joinpath(current, "Project.toml")) && !isfile(joinpath(current, "_quarto.yml"))
        parent = dirname(current)
        parent == current && break
        current = parent
    end
    current
end
include(joinpath(project_root, "scripts", "ensure_packages.jl"))

@auto_using DifferentialEquations CairoMakie

# Base SIR model
"""SIR epidemic model: Susceptible → Infected → Recovered."""
function sir!(du, u, p, t)
    S, I, R = u
    β, γ = p
    du[1] = -β * S * I
    du[2] = β * S * I - γ * I
    du[3] = γ * I
end

u0 = [0.99, 0.01, 0.0]
tspan = (0.0, 100.0)

# Base parameters
β_base = 0.3
γ_base = 0.1

# Vary β and compute peak infection
β_values = 0.2:0.05:0.4
peak_infections_β = Float64[]

for β in β_values
    p = (β, γ_base)  # Using tuple for better performance
    prob = ODEProblem(sir!, u0, tspan, p)
    sol = solve(prob, Tsit5())
    peak = maximum([u[2] for u in sol.u])
    push!(peak_infections_β, peak)
end

# Vary γ and compute peak infection
γ_values = 0.05:0.01:0.15
peak_infections_γ = Float64[]

for γ in γ_values
    p = (β_base, γ)  # Using tuple for better performance
    prob = ODEProblem(sir!, u0, tspan, p)
    sol = solve(prob, Tsit5())
    peak = maximum([u[2] for u in sol.u])
    push!(peak_infections_γ, peak)
end

# Visualise sensitivity
let
fig = Figure(size = (1000, 400))
ax1 = Axis(fig[1, 1], title = "Sensitivity to Transmission Rate β",
           xlabel = "β", ylabel = "Peak Infection Rate")
ax2 = Axis(fig[1, 2], title = "Sensitivity to Recovery Rate γ",
           xlabel = "γ", ylabel = "Peak Infection Rate")

lines!(ax1, β_values, peak_infections_β, linewidth = 2, color = :blue)
scatter!(ax1, β_values, peak_infections_β, color = :blue, markersize = 8)

lines!(ax2, γ_values, peak_infections_γ, linewidth = 2, color = :red)
scatter!(ax2, γ_values, peak_infections_γ, color = :red, markersize = 8)

    fig  # Only this gets displayed
end

# Approximate sensitivity via finite differences (secant over the tested parameter range)
sensitivity_β = (peak_infections_β[end] - peak_infections_β[1]) / (β_values[end] - β_values[1])
sensitivity_γ = (peak_infections_γ[end] - peak_infections_γ[1]) / (γ_values[end] - γ_values[1])

println("Local sensitivity (approximate derivatives):")
println("  ∂(peak I)/∂β ≈ ", round(sensitivity_β, digits=3))
println("  ∂(peak I)/∂γ ≈ ", round(sensitivity_γ, digits=3))
println("\nInterpretation:")
println("  Peak infection is ", abs(sensitivity_β) > abs(sensitivity_γ) ? "more" : "less",
        " sensitive to β than to γ")
Local sensitivity (approximate derivatives):
  ∂(peak I)/∂β ≈ 1.222
  ∂(peak I)/∂γ ≈ -3.816

Interpretation:
  Peak infection is less sensitive to β than to γ

22.3.4 Adjoint Sensitivity Methods

For systems with many parameters, computing sensitivities using finite differences becomes expensive (\(O(P)\) where \(P\) is the number of parameters). Adjoint sensitivity methods provide an efficient alternative that scales as \(O(1)\) in the number of parameters (Rackauckas 2026).

The Adjoint Method: Instead of computing \(\frac{\partial X_t}{\partial \theta}\) for each parameter, we solve a backward ODE:

\[ \lambda'(t) = -\left(\frac{\partial f}{\partial u}\right)^{T} \lambda(t) + \left(\frac{\partial g}{\partial u}\right)^{T}, \quad \lambda(T) = 0 \]

where:

  • \(\lambda(t)\) is the adjoint state (gradient with respect to \(u(t)\))
  • \(f\) is the ODE right-hand side
  • \(g\) is the cost function

Advantages:

  • Efficient: cost is roughly independent of the number of parameters
  • Automatic: SciMLSensitivity.jl selects and applies a suitable adjoint algorithm
  • Memory efficient: interpolating and checkpointing adjoints trade recomputation for memory

Implementation:

# Find project root and include ensure_packages.jl
project_root = let
    current = pwd()
    while !isfile(joinpath(current, "Project.toml")) && !isfile(joinpath(current, "_quarto.yml"))
        parent = dirname(current)
        parent == current && break
        current = parent
    end
    current
end
include(joinpath(project_root, "scripts", "ensure_packages.jl"))

@auto_using DifferentialEquations SciMLSensitivity Zygote CairoMakie

# SIR model
function sir!(du, u, p, t)
    S, I, R = u
    β, γ = p
    du[1] = -β * S * I
    du[2] = β * S * I - γ * I
    du[3] = γ * I
end

u0 = [0.99, 0.01, 0.0]
p0 = [0.3, 0.1]  # β, γ (use vector for adjoint sensitivity)
tspan = (0.0, 50.0)

# Cost function: peak infection rate (use saveat and sensealg for stable AD)
function cost(p)
    prob = ODEProblem(sir!, u0, tspan, p)
    sol = solve(prob, Tsit5(); saveat = 0.5, sensealg = InterpolatingAdjoint(autojacvec = true))
    if string(sol.retcode) != "Success"
        return Inf
    end
    peak_I = maximum([u[2] for u in sol.u])
    return peak_I
end

# Compute gradient using adjoint method (automatic via Zygote)
# Note: For adjoint methods, parameters should be a vector, not a tuple
grad = Zygote.gradient(cost, p0)
cost_val = cost(p0)

println("Adjoint sensitivity analysis:")
println("  Cost (peak I): ", cost_val == Inf ? "Inf (solve failed)" : round(cost_val, digits = 4))
if cost_val != Inf && grad[1] !== nothing
    g = grad[1]
    println("  Gradient ∂(peak I)/∂β: ", round(g[1], digits = 4))
    println("  Gradient ∂(peak I)/∂γ: ", round(g[2], digits = 4))
    println("\nInterpretation:")
    println("  Increasing β by 0.01 → peak I changes by ≈ ", round(g[1] * 0.01, digits = 4))
    println("  Increasing γ by 0.01 → peak I changes by ≈ ", round(g[2] * 0.01, digits = 4))
else
    println("  Gradient: not available (solve failed or gradient undefined)")
end
Adjoint sensitivity analysis:
  Cost (peak I): 0.3038
  Gradient ∂(peak I)/∂β: 1.2467
  Gradient ∂(peak I)/∂γ: -3.6425

Interpretation:
  Increasing β by 0.01 → peak I changes by ≈ 0.0125
  Increasing γ by 0.01 → peak I changes by ≈ -0.0364

When to Use Adjoint Methods:

  • Many parameters: the adjoint solve adds roughly one backward integration, regardless of how many parameters need gradients
  • Few parameters: forward sensitivity or finite differences are often simpler and competitive
  • Gradient-based workflows: adjoints supply the gradients needed for optimisation and parameter estimation

Connection to Parameter Estimation: Adjoint methods are essential for efficient parameter estimation via the shooting method (see Parameter Estimation via Shooting Method).

22.4 Sensitivity to Initial Conditions

22.4.1 Initial Condition Sensitivity

Question: How do system dynamics change with different initial conditions?

Methods:

  • Lyapunov exponents: Measure rate of divergence/convergence
  • Basin analysis: Which initial conditions lead to which attractors?
  • Sensitivity to perturbations: How do small initial perturbations affect trajectories?
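
The first of these can be made concrete with a minimal, self-contained sketch: estimate the largest Lyapunov exponent of the logistic map by following two nearby trajectories and renormalising their separation after every step (Benettin's method). The logistic map is an illustrative stand-in here, not a model from this chapter:

```julia
# Estimate the largest Lyapunov exponent of the logistic map x ↦ r·x·(1 − x)
# by following two nearby trajectories and renormalising their separation
# after every step (Benettin's method).
function lyapunov_logistic(r; x0 = 0.3, d0 = 1e-9, n = 10_000, burnin = 100)
    x = x0
    for _ in 1:burnin                  # discard the transient
        x = r * x * (1 - x)
    end
    y = x + d0                         # perturbed companion trajectory
    acc = 0.0                          # accumulated log-stretching
    for _ in 1:n
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        d = abs(y - x)
        d == 0 && (d = d0)             # guard against exact coincidence
        acc += log(d / d0)
        y = x + (y >= x ? d0 : -d0)    # reset the separation to d0
    end
    return acc / n                     # positive ⇒ exponential divergence
end

println("λ (r = 4.0, chaotic): ", round(lyapunov_logistic(4.0), digits = 3))
println("λ (r = 2.5, stable):  ", round(lyapunov_logistic(2.5), digits = 3))
```

A positive exponent (the chaotic regime r = 4) signals exponential divergence of nearby trajectories; a negative exponent (r = 2.5) signals convergence to a stable fixed point.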

22.4.2 Implementation: Initial Condition Sensitivity

We can assess sensitivity to initial conditions by comparing trajectories from slightly different starting points:

# Find project root and include ensure_packages.jl
project_root = let
    current = pwd()
    while !isfile(joinpath(current, "Project.toml")) && !isfile(joinpath(current, "_quarto.yml"))
        parent = dirname(current)
        parent == current && break
        current = parent
    end
    current
end
include(joinpath(project_root, "scripts", "ensure_packages.jl"))

@auto_using DifferentialEquations CairoMakie

# Example: Disease model
"""Disease model: logistic growth with growth rate β."""
function disease!(du, u, p, t)
    X = u[1]
    β = p[1]
    du[1] = β * X * (1 - X)  # Logistic growth
end

β = 0.2
p = (β,)  # Using tuple for better performance (note: single-element tuple needs comma)
tspan = (0.0, 30.0)

# Base initial condition
X0_base = 0.1

# Perturbed initial conditions
perturbations = [-0.05, -0.02, 0.0, 0.02, 0.05]
trajectories = []

for pert in perturbations
    X0 = X0_base + pert
    u0 = [X0]
    prob = ODEProblem(disease!, u0, tspan, p)
    sol = solve(prob, Tsit5())
    push!(trajectories, sol)
end

# Visualise sensitivity
let
fig = Figure(size = (800, 400))
ax = Axis(fig[1, 1], title = "Initial Condition Sensitivity",
          xlabel = "Time", ylabel = "Disease Severity")

colors = [:blue, :lightblue, :black, :lightcoral, :red]
for (i, sol) in enumerate(trajectories)
    label = i == 3 ? "Base (X₀ = $(X0_base))" : "X₀ = $(X0_base + perturbations[i])"
    lines!(ax, sol.t, [u[1] for u in sol.u], label = label,
           linewidth = 2, color = colors[i], linestyle = i == 3 ? :solid : :dash)
end

axislegend(ax, position = :rt)

    fig  # Only this gets displayed
end

# Compute divergence (difference at final time)
final_values = [sol.u[end][1] for sol in trajectories]
divergence = maximum(final_values) - minimum(final_values)
println("Divergence at final time: ", round(divergence, digits=3))
println("Initial perturbation range: ", round(maximum(perturbations) - minimum(perturbations), digits=3))
println("Amplification factor: ", round(divergence / (maximum(perturbations) - minimum(perturbations)), digits=2))
Divergence at final time: 0.031
Initial perturbation range: 0.1
Amplification factor: 0.31

22.5 Sensitivity to Structural Perturbations

22.5.1 Structural Sensitivity

Question: How do system dynamics change with structural changes (edge removal, node removal)?

Methods:

  • Edge removal: Remove edges and observe system behaviour
  • Node removal: Remove nodes and observe system behaviour
  • Structural robustness: How robust is system to structural changes?

22.5.2 Implementation: Structural Sensitivity

We can assess structural sensitivity by modifying the system structure (e.g., removing interactions):

# Find project root and include ensure_packages.jl
project_root = let
    current = pwd()
    while !isfile(joinpath(current, "Project.toml")) && !isfile(joinpath(current, "_quarto.yml"))
        parent = dirname(current)
        parent == current && break
        current = parent
    end
    current
end
include(joinpath(project_root, "scripts", "ensure_packages.jl"))

@auto_using DifferentialEquations CairoMakie

# Example: Two-compartment model with interaction
# Original: X₁ → X₂ (X₁ affects X₂)
# Modified: Remove interaction (X₁ and X₂ independent)

# Original system with interaction
"""Two-species system: X₁ affects X₂ via interaction term."""
function system_original!(du, u, p, t)
    X₁, X₂ = u
    α, β, γ = p
    du[1] = α * X₁ * (1 - X₁)  # X₁ grows logistically
    du[2] = β * X₂ * (1 - X₂) + γ * X₁  # X₂ grows + affected by X₁
end

# Modified system without interaction (structural change)
"""Two-species system: interaction removed (structural change)."""
function system_modified!(du, u, p, t)
    X₁, X₂ = u
    α, β, γ = p
    du[1] = α * X₁ * (1 - X₁)  # X₁ unchanged
    du[2] = β * X₂ * (1 - X₂)  # X₂: removed interaction with X₁ (γ * X₁ term removed)
end

α, β, γ = 0.2, 0.15, 0.1
p = (α, β, γ)  # Using tuple for better performance
u0 = [0.1, 0.1]
tspan = (0.0, 30.0)

# Solve both systems
prob_original = ODEProblem(system_original!, u0, tspan, p)
sol_original = solve(prob_original, Tsit5())

prob_modified = ODEProblem(system_modified!, u0, tspan, p)
sol_modified = solve(prob_modified, Tsit5())

# Visualise comparison
let
fig = Figure(size = (1000, 400))
ax1 = Axis(fig[1, 1], title = "X₁ (unchanged)", xlabel = "Time", ylabel = "Value")
ax2 = Axis(fig[1, 2], title = "X₂ (affected by structural change)", xlabel = "Time", ylabel = "Value")

lines!(ax1, sol_original.t, [u[1] for u in sol_original.u],
       label = "Original", linewidth = 2, color = :blue)
lines!(ax1, sol_modified.t, [u[1] for u in sol_modified.u],
       label = "Modified (no interaction)", linewidth = 2, color = :red, linestyle = :dash)
axislegend(ax1)

lines!(ax2, sol_original.t, [u[2] for u in sol_original.u],
       label = "Original", linewidth = 2, color = :blue)
lines!(ax2, sol_modified.t, [u[2] for u in sol_modified.u],
       label = "Modified (no interaction)", linewidth = 2, color = :red, linestyle = :dash)
axislegend(ax2)

    fig  # Only this gets displayed
end

# Compare final states
final_X2_original = sol_original.u[end][2]
final_X2_modified = sol_modified.u[end][2]
difference = abs(final_X2_original - final_X2_modified)

println("Structural sensitivity analysis:")
println("  Final X₂ (original): ", round(final_X2_original, digits=3))
println("  Final X₂ (modified, no interaction): ", round(final_X2_modified, digits=3))
println("  Difference: ", round(difference, digits=3))
println("  System is ", difference > 0.1 ? "sensitive" : "robust", " to this structural change")
Structural sensitivity analysis:
  Final X₂ (original): 1.431
  Final X₂ (modified, no interaction): 0.909
  Difference: 0.522
  System is sensitive to this structural change

22.6 Robustness Metrics

22.6.1 Local Robustness

Local robustness measures response to small perturbations:

  • Eigenvalue analysis: Linear stability
  • Lyapunov exponents: Rate of divergence/convergence
  • Jacobian analysis: Local linearisation
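
As a sketch of eigenvalue-based local robustness, we can linearise the two-species system from the structural-sensitivity example at its steady state using a finite-difference Jacobian and inspect the eigenvalues. Plain forward-Euler integration and the `LinearAlgebra` standard library keep the example self-contained; the parameter values match those used above:

```julia
using LinearAlgebra

# Right-hand side of the two-species system from the structural-sensitivity
# example (same α, β, γ as above)
f(u, p) = begin
    α, β, γ = p
    [α * u[1] * (1 - u[1]),
     β * u[2] * (1 - u[2]) + γ * u[1]]
end

# Finite-difference approximation of the Jacobian ∂f/∂u at a point u
function fd_jacobian(f, u, p; h = 1e-6)
    n = length(u)
    J = zeros(n, n)
    f0 = f(u, p)
    for j in 1:n
        uj = copy(u)
        uj[j] += h
        J[:, j] = (f(uj, p) .- f0) ./ h
    end
    return J
end

p = (0.2, 0.15, 0.1)

# Integrate to the steady state with forward Euler (stand-in for the ODE solver)
u = [0.5, 0.3]
for _ in 1:200_000
    u .+= 0.01 .* f(u, p)
end

evals = eigvals(fd_jacobian(f, u, p))
println("Equilibrium ≈ ", round.(u, digits = 3))
println("Eigenvalues ≈ ", round.(real.(evals), digits = 3))
println("Locally stable: ", all(real.(evals) .< 0))
```

Negative real parts of the Jacobian eigenvalues at the equilibrium indicate local asymptotic stability; their magnitudes set the rate at which small perturbations decay.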

22.6.2 Global Robustness

Global robustness measures response to large perturbations:

  • Basin of attraction: Region of state space that converges to equilibrium
  • Global Lyapunov function: Energy-like function that decreases
  • Structural robustness: Robustness to edge/node removal
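
For basin-of-attraction analysis, a self-contained sketch: take the bistable system \(x' = x(1-x)(x-a)\) (an illustrative system, not one from this chapter), whose stable equilibria are \(x = 0\) and \(x = 1\) with the unstable point \(x = a\) separating their basins, and classify a grid of initial conditions by where they converge:

```julia
# Bistable system x' = x(1 - x)(x - a): x = 0 and x = 1 are stable,
# x = a is the unstable boundary between their basins.
a = 0.3
f(x) = x * (1 - x) * (x - a)

# Integrate with forward Euler and classify by the nearest equilibrium
function attractor(x0; dt = 0.01, steps = 20_000)
    x = x0
    for _ in 1:steps
        x += dt * f(x)
    end
    return x < a ? 0.0 : 1.0
end

grid = 0.0:0.01:1.0
basin_of_1 = count(x0 -> attractor(x0) == 1.0, grid) / length(grid)
println("Fraction of [0, 1] attracted to x = 1: ", round(basin_of_1, digits = 2))
```

Initial conditions below \(a = 0.3\) fall into the basin of \(x = 0\); the rest converge to \(x = 1\). Sampling initial conditions like this generalises to higher-dimensional systems where basin boundaries cannot be found analytically.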

22.6.3 Implementation: Robustness Metrics

We can compute robustness metrics by testing system response to perturbations:

# Find project root and include ensure_packages.jl
project_root = let
    current = pwd()
    while !isfile(joinpath(current, "Project.toml")) && !isfile(joinpath(current, "_quarto.yml"))
        parent = dirname(current)
        parent == current && break
        current = parent
    end
    current
end
include(joinpath(project_root, "scripts", "ensure_packages.jl"))

@auto_using DifferentialEquations CairoMakie

# Example: System with equilibrium
"""Linear system converging to equilibrium X_eq with rate α."""
function system!(du, u, p, t)
    X = u[1]
    α, X_eq = p
    # System converges to equilibrium X_eq
    du[1] = -α * (X - X_eq)
end

α = 0.5
X_eq = 0.5
p = (α, X_eq)  # Using tuple for better performance
tspan = (0.0, 10.0)

# Test robustness: start from different initial conditions
initial_conditions = [0.1, 0.3, 0.5, 0.7, 0.9]
trajectories = []
final_distances = Float64[]

for X0 in initial_conditions
    u0 = [X0]
    prob = ODEProblem(system!, u0, tspan, p)
    sol = solve(prob, Tsit5())
    push!(trajectories, sol)
    # Distance from equilibrium at final time
    distance = abs(sol.u[end][1] - X_eq)
    push!(final_distances, distance)
end

# Visualise convergence to equilibrium
let
fig = Figure(size = (800, 400))
ax = Axis(fig[1, 1], title = "Robustness: Convergence to Equilibrium",
          xlabel = "Time", ylabel = "X")

for (i, sol) in enumerate(trajectories)
    lines!(ax, sol.t, [u[1] for u in sol.u], linewidth = 2)
end

# Mark equilibrium
hlines!(ax, [X_eq], color = :red, linestyle = :dash, linewidth = 2, label = "Equilibrium")
axislegend(ax)

    fig  # Only this gets displayed
end

# Robustness metric: maximum distance from equilibrium
max_distance = maximum(final_distances)
println("Robustness metrics:")
println("  Equilibrium: ", X_eq)
println("  Maximum distance from equilibrium: ", round(max_distance, digits=4))
println("  System is ", max_distance < 0.01 ? "highly robust" : max_distance < 0.1 ? "robust" : "less robust",
        " (converges to equilibrium from all tested initial conditions)")
Robustness metrics:
  Equilibrium: 0.5
  Maximum distance from equilibrium: 0.0027
  System is highly robust (converges to equilibrium from all tested initial conditions)

22.6.4 Global Sensitivity Analysis

Global sensitivity analysis assesses how system behaviour changes across the entire parameter space, not just small perturbations (Saltelli et al. 2008; Rackauckas 2026). This is particularly important for:

  • Nonlinear systems: Where local sensitivity may not capture full behaviour
  • Parameter interactions: When parameters interact nonlinearly
  • Robustness assessment: Understanding which parameters matter most across their full ranges

Sobol Indices: Variance-based sensitivity measures that decompose output variance into contributions from each parameter:

  • First-order indices: \(S_i = \frac{\text{Var}(\mathbb{E}[Y \mid X_i])}{\text{Var}(Y)}\) — main effect of parameter \(i\)
  • Total-order indices: \(S_{Ti} = 1 - \frac{\text{Var}(\mathbb{E}[Y \mid X_{\sim i}])}{\text{Var}(Y)}\) — total effect including interactions

When to Use:

  • Global sensitivity: Large parameter ranges, nonlinear interactions, robustness analysis
  • Local sensitivity: Small perturbations, linear regime, gradient-based optimisation
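
The first-order index can also be estimated directly, without a dedicated package, using Saltelli's paired-matrix Monte Carlo estimator. The sketch below applies it to a toy linear model \(Y = X_1 + 2X_2\) with \(X_1, X_2 \sim U(0,1)\), chosen because the true indices are known analytically (\(S_1 = 0.2\), \(S_2 = 0.8\)):

```julia
using Random, Statistics

# Monte Carlo estimator of first-order Sobol indices via Saltelli's scheme:
# S_i = Var(E[Y | X_i]) / Var(Y), estimated from paired sample matrices A and B.
function sobol_first_order(f, d, N; rng = Random.default_rng())
    A = rand(rng, N, d)
    B = rand(rng, N, d)
    fA = [f(A[j, :]) for j in 1:N]
    fB = [f(B[j, :]) for j in 1:N]
    V = var(vcat(fA, fB))                    # total output variance
    S = zeros(d)
    for i in 1:d
        ABi = copy(A)
        ABi[:, i] = B[:, i]                  # A with column i swapped in from B
        fABi = [f(ABi[j, :]) for j in 1:N]
        # Saltelli (2010) estimator of Var(E[Y | X_i])
        S[i] = mean(fB .* (fABi .- fA)) / V
    end
    return S
end

# Toy model with known indices: S₁ = 0.2, S₂ = 0.8
model(x) = x[1] + 2 * x[2]
S = sobol_first_order(model, 2, 100_000; rng = MersenneTwister(42))
println("Estimated S₁ ≈ ", round(S[1], digits = 2), ", S₂ ≈ ", round(S[2], digits = 2))
```

For this additive model the first-order indices sum to one; a shortfall of \(\sum_i S_i\) below one is the usual Monte Carlo signature of parameter interactions.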

Implementation:

# Find project root and include ensure_packages.jl
project_root = let
    current = pwd()
    while !isfile(joinpath(current, "Project.toml")) && !isfile(joinpath(current, "_quarto.yml"))
        parent = dirname(current)
        parent == current && break
        current = parent
    end
    current
end
include(joinpath(project_root, "scripts", "ensure_packages.jl"))

@auto_using DifferentialEquations Random CairoMakie

# Check if GlobalSensitivity is available
global_sensitivity_available = false
try
    using GlobalSensitivity
    global_sensitivity_available = true
catch
    println("Note: GlobalSensitivity.jl not available. This example requires GlobalSensitivity.jl.")
end

if global_sensitivity_available

# SIR model
function sir!(du, u, p, t)
    S, I, R = u
    β, γ = p
    du[1] = -β * S * I
    du[2] = β * S * I - γ * I
    du[3] = γ * I
end

# Model function for sensitivity analysis
function model_function(p)
    β, γ = p
    u0 = [0.99, 0.01, 0.0]
    tspan = (0.0, 50.0)
    prob = ODEProblem(sir!, u0, tspan, (β, γ))
    sol = solve(prob, Tsit5())
    if sol.retcode != :Success
        return 0.0
    end
    # Output: peak infection rate
    return maximum([u[2] for u in sol.u])
end

# Parameter bounds: β ∈ [0.1, 0.5], γ ∈ [0.05, 0.2]
bounds = [(0.1, 0.5), (0.05, 0.2)]

# Sobol sensitivity analysis
Random.seed!(42)
sobol_result = gsa(model_function, Sobol(), bounds, samples = 1000)

# Visualise results
let
fig = Figure(size = (1000, 400))
ax1 = Axis(fig[1, 1], title = "First-Order Sobol Indices",
           xlabel = "Parameter", ylabel = "Sensitivity Index",
           xticks = ([1, 2], ["β (transmission)", "γ (recovery)"]))
ax2 = Axis(fig[1, 2], title = "Total-Order Sobol Indices",
           xlabel = "Parameter", ylabel = "Sensitivity Index",
           xticks = ([1, 2], ["β (transmission)", "γ (recovery)"]))

barplot!(ax1, [1, 2], sobol_result.S1, color = [:blue, :red])
barplot!(ax2, [1, 2], sobol_result.ST, color = [:blue, :red])

# Add value labels
for (i, val) in enumerate(sobol_result.S1)
    text!(ax1, i, val + 0.02, text = string(round(val, digits=3)),
          align = (:center, :bottom), fontsize = 12)
end
for (i, val) in enumerate(sobol_result.ST)
    text!(ax2, i, val + 0.02, text = string(round(val, digits=3)),
          align = (:center, :bottom), fontsize = 12)
end

fig
end

println("Global sensitivity analysis (Sobol indices):")
println("  First-order indices (main effects):")
println("    S_β = ", round(sobol_result.S1[1], digits=3))
println("    S_γ = ", round(sobol_result.S1[2], digits=3))
println("  Total-order indices (including interactions):")
println("    ST_β = ", round(sobol_result.ST[1], digits=3))
println("    ST_γ = ", round(sobol_result.ST[2], digits=3))
println("\nInterpretation:")
println("  β has ", sobol_result.S1[1] > sobol_result.S1[2] ? "larger" : "smaller",
        " main effect on peak infection")
if sobol_result.ST[1] > sobol_result.S1[1] || sobol_result.ST[2] > sobol_result.S1[2]
    println("  Parameter interactions are present (ST > S1)")
end
else
    println("Skipping global sensitivity analysis: GlobalSensitivity.jl not available")
end
Global sensitivity analysis (Sobol indices):
  First-order indices (main effects):
    S_β = NaN
    S_γ = NaN
  Total-order indices (including interactions):
    ST_β = NaN
    ST_γ = NaN

Interpretation:
  β has smaller main effect on peak infection

Key Points:

  • Global sensitivity captures nonlinear effects across full parameter ranges
  • Sobol indices quantify both main effects and interactions
  • Essential for understanding which parameters matter most for robustness
  • Complements local sensitivity for comprehensive analysis

22.6.5 Uncertainty Propagation

When parameters or initial conditions are uncertain, we need to propagate this uncertainty through the model to understand uncertainty in predictions (Rackauckas 2026). This is crucial for:

  • Intervention effect uncertainty: How uncertain are intervention predictions?
  • Counterfactual bounds: What are plausible ranges for counterfactual outcomes?
  • Decision-making: Making robust decisions under uncertainty

Uncertainty Sources:

  • Parameter uncertainty (epistemic): Uncertainty about parameter values
  • Measurement uncertainty (observational): Error in observations
  • Numerical uncertainty (computational): ODE solver errors
  • Model uncertainty (structural): Uncertainty about model structure

Methods:

  • Monte Carlo: Sample parameters, run model, analyse outputs (most general)
  • Linearisation: \(\text{Var}(f(x)) \approx J \Sigma J^T\) where \(J\) is Jacobian (fast, approximate)
  • Polynomial chaos: More accurate than linearisation (intermediate complexity)
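
Monte Carlo propagation, the most general of these methods, can be sketched for the SIR peak used throughout this chapter: draw parameters from their assumed distributions (here \(\beta \sim N(0.3, 0.05^2)\) and \(\gamma \sim N(0.1, 0.02^2)\), matching the uncertainties in the Measurements.jl example below), simulate each draw with a self-contained fixed-step RK4 integrator standing in for the ODE solver, and summarise the resulting distribution of peak infection:

```julia
using Random, Statistics

# SIR right-hand side
sir_rhs(u, β, γ) = [-β * u[1] * u[2], β * u[1] * u[2] - γ * u[2], γ * u[2]]

# Simulate with classical fixed-step RK4 and track the peak of I
function peak_infection(β, γ; dt = 0.05, T = 100.0)
    u = [0.99, 0.01, 0.0]
    peak = u[2]
    for _ in 1:round(Int, T / dt)
        k1 = sir_rhs(u, β, γ)
        k2 = sir_rhs(u .+ dt/2 .* k1, β, γ)
        k3 = sir_rhs(u .+ dt/2 .* k2, β, γ)
        k4 = sir_rhs(u .+ dt .* k3, β, γ)
        u = u .+ dt/6 .* (k1 .+ 2 .* k2 .+ 2 .* k3 .+ k4)
        peak = max(peak, u[2])
    end
    return peak
end

# Sample parameters from their assumed distributions and propagate
rng = MersenneTwister(42)
samples = [peak_infection(0.3 + 0.05 * randn(rng), 0.1 + 0.02 * randn(rng))
           for _ in 1:500]
println("Peak I: ", round(mean(samples), digits = 3), " ± ", round(std(samples), digits = 3))
```

Unlike linearisation, the Monte Carlo output distribution captures nonlinearity and need not be symmetric; comparing its standard deviation with the linearised ± σ band is itself a useful robustness check.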

Implementation with Measurements.jl:

# Find project root and include ensure_packages.jl
project_root = let
    current = pwd()
    while !isfile(joinpath(current, "Project.toml")) && !isfile(joinpath(current, "_quarto.yml"))
        parent = dirname(current)
        parent == current && break
        current = parent
    end
    current
end
include(joinpath(project_root, "scripts", "ensure_packages.jl"))

@auto_using DifferentialEquations CairoMakie

# Check if Measurements is available
measurements_available = false
try
    using Measurements
    measurements_available = true
catch
    println("Note: Measurements.jl not available. This example requires Measurements.jl.")
end

if measurements_available

# SIR model
function sir!(du, u, p, t)
    S, I, R = u
    β, γ = p
    du[1] = -β * S * I
    du[2] = β * S * I - γ * I
    du[3] = γ * I
end

# Parameters with uncertainty: β = 0.3 ± 0.05, γ = 0.1 ± 0.02
β = 0.3 ± 0.05
γ = 0.1 ± 0.02

u0 = [0.99 ± 0.0, 0.01 ± 0.0, 0.0 ± 0.0]  # Initial conditions (certain)
tspan = (0.0, 50.0)

# Solve ODE with uncertain parameters
# Measurements.jl automatically propagates uncertainty
prob = ODEProblem(sir!, u0, tspan, (β, γ))
sol = solve(prob, Tsit5(), saveat = 0.0:1.0:50.0)

# Extract mean and uncertainty
I_mean = [Measurements.value(u[2]) for u in sol.u]
I_uncertainty = [Measurements.uncertainty(u[2]) for u in sol.u]
I_upper = I_mean .+ I_uncertainty
I_lower = I_mean .- I_uncertainty

# Visualise
let
fig = Figure(size = (800, 400))
ax = Axis(fig[1, 1], title = "Uncertainty Propagation: Infected (I)",
          xlabel = "Time", ylabel = "Proportion")

# Uncertainty band
band!(ax, sol.t, I_lower, I_upper, color = (:blue, 0.2), label = "Uncertainty band")
lines!(ax, sol.t, I_mean, linewidth = 2, color = :blue, label = "Mean")
lines!(ax, sol.t, I_upper, linewidth = 1, color = :blue, linestyle = :dash, label = "Mean ± σ")
lines!(ax, sol.t, I_lower, linewidth = 1, color = :blue, linestyle = :dash)

axislegend(ax)

fig
end

# Report uncertainty at peak
peak_idx = argmax(I_mean)
peak_time = sol.t[peak_idx]
peak_value = I_mean[peak_idx]
peak_uncertainty = I_uncertainty[peak_idx]

println("Uncertainty propagation results:")
println("  Peak infection occurs at t = ", round(peak_time, digits=1))
println("  Peak value: ", round(peak_value, digits=4), " ± ", round(peak_uncertainty, digits=4))
println("  Relative uncertainty: ", round(peak_uncertainty / peak_value * 100, digits=1), "%")
else
    println("Skipping uncertainty propagation: Measurements.jl not available")
end
Uncertainty propagation results:
  Peak infection occurs at t = 27.0
  Peak value: 0.3036 ± 0.0906
  Relative uncertainty: 29.8%

Key Points:

  • Uncertainty propagation quantifies prediction uncertainty
  • Essential for understanding intervention effect uncertainty
  • Connects to counterfactual reasoning (see Counterfactual Dynamics: Alternative Trajectories)
  • Different methods trade off accuracy vs computational cost

22.7 Intervention Effects on Robustness

22.7.1 How Interventions Affect Robustness

Question: How do interventions affect system robustness?

Example: How does removing a keystone species affect ecosystem robustness?

Methods:

  • Intervention → structural change: Interventions modify system structure
  • Structural change → robustness: Modified structure has different robustness properties
  • Evaluate: Compare robustness before and after intervention

22.7.2 Implementation: Intervention Effects on Robustness

We can assess how interventions affect system robustness:

# Find project root and include ensure_packages.jl
project_root = let
    current = pwd()
    while !isfile(joinpath(current, "Project.toml")) && !isfile(joinpath(current, "_quarto.yml"))
        parent = dirname(current)
        parent == current && break
        current = parent
    end
    current
end
include(joinpath(project_root, "scripts", "ensure_packages.jl"))

@auto_using DifferentialEquations CairoMakie

# Example: Ecosystem with keystone species
# Original: X₁ (keystone) affects X₂ (other species)
# Intervention: Remove keystone (do(X₁ = 0))

"""Ecosystem: keystone species X₁ affects other species X₂."""
function ecosystem_original!(du, u, p, t)
    X₁, X₂ = u
    α, β, γ = p
    du[1] = α * X₁ * (1 - X₁)  # Keystone species
    du[2] = β * X₂ * (1 - X₂) + γ * X₁  # Other species depends on keystone
end

"""Ecosystem: keystone removed via intervention do(X₁ = 0)."""
function ecosystem_intervened!(du, u, p, t)
    X₁, X₂ = u
    α, β, γ = p
    # Intervention: do(X₁ = 0) - keystone removed
    du[1] = 0.0  # Keystone fixed at 0
    du[2] = β * X₂ * (1 - X₂)  # Other species no longer depends on keystone
end

α, β, γ = 0.2, 0.15, 0.1
p = (α, β, γ)  # Using tuple for better performance
u0 = [0.5, 0.3]  # Both species present
tspan = (0.0, 30.0)

# Solve both systems
prob_original = ODEProblem(ecosystem_original!, u0, tspan, p)
sol_original = solve(prob_original, Tsit5())

prob_intervened = ODEProblem(ecosystem_intervened!, [0.0, u0[2]], tspan, p)
sol_intervened = solve(prob_intervened, Tsit5())

# Visualise
let
fig = Figure(size = (1000, 400))
ax1 = Axis(fig[1, 1], title = "Keystone Species X₁", xlabel = "Time", ylabel = "Abundance")
ax2 = Axis(fig[1, 2], title = "Other Species X₂", xlabel = "Time", ylabel = "Abundance")

lines!(ax1, sol_original.t, [u[1] for u in sol_original.u],
       label = "Original", linewidth = 2, color = :blue)
lines!(ax1, sol_intervened.t, [u[1] for u in sol_intervened.u],
       label = "Intervened (removed)", linewidth = 2, color = :red, linestyle = :dash)
axislegend(ax1)

lines!(ax2, sol_original.t, [u[2] for u in sol_original.u],
       label = "Original", linewidth = 2, color = :blue)
lines!(ax2, sol_intervened.t, [u[2] for u in sol_intervened.u],
       label = "Intervened (keystone removed)", linewidth = 2, color = :red, linestyle = :dash)
axislegend(ax2)

    fig  # Only this gets displayed
end

# Compare robustness (stability of X₂)
final_X2_original = sol_original.u[end][2]
final_X2_intervened = sol_intervened.u[end][2]
change = final_X2_intervened - final_X2_original

println("Intervention effect on robustness:")
println("  Final X₂ (original): ", round(final_X2_original, digits=3))
println("  Final X₂ (keystone removed): ", round(final_X2_intervened, digits=3))
println("  Change: ", round(change, digits=3))
println("  System robustness: ", abs(change) < 0.1 ? "Maintained" : "Reduced",
        " (", abs(change) < 0.1 ? "small" : "large", " change in X₂)")
Intervention effect on robustness:
  Final X₂ (original): 1.454
  Final X₂ (keystone removed): 0.975
  Change: -0.479
  System robustness: Reduced (large change in X₂)

22.8 E-Values

When drawing causal conclusions from observational data, unmeasured confounding threatens validity. E-values provide a simple, interpretable measure of how strong unmeasured confounding would need to be to explain away an observed effect (VanderWeele and Ding 2017).

22.8.1 The E-Value Concept

The E-value is the minimum strength of association (on the risk ratio scale) that an unmeasured confounder would need to have with both treatment and outcome to fully explain away the observed effect. Formally, for an observed risk ratio \(RR\):

\[ E = RR + \sqrt{RR \cdot (RR - 1)} \]

Interpretation: If the E-value is large (e.g., \(E > 4\)), then any unmeasured confounder would need to be very strongly associated with both treatment and outcome to nullify the observed effect. Conversely, a small E-value (e.g., \(E \approx 2\)) suggests that modest unmeasured confounding could plausibly explain the result.

22.8.2 Advantages

  • Simple to compute: Requires only the observed risk ratio (or odds ratio approximation)
  • Scale-free: Interpretable across different outcome types
  • Widely applicable: Works for binary outcomes, survival outcomes, and continuous outcomes (with appropriate transformation)
  • Intuitive: Directly answers “how robust is my conclusion?”

22.8.3 Biological Example

In observational drug effect studies, researchers often cannot randomise treatment. Suppose an observational study finds that a drug reduces mortality risk with \(RR = 0.5\) (a 50% risk reduction). Substituting directly gives

\[ E = 0.5 + \sqrt{0.5 \cdot (0.5 - 1)} = 0.5 + \sqrt{-0.25}, \]

which is undefined. For protective effects (\(RR < 1\)), the formula is instead applied to \(1/RR\), the risk ratio comparing unexposed to exposed. Here \(1/RR = 2\), so \(E = 2 + \sqrt{2 \cdot (2 - 1)} = 2 + \sqrt{2} \approx 3.41\). An unmeasured confounder would need risk ratios of at least 3.41 with both treatment and outcome to explain away this protective effect.
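
This rule can be wrapped in a small helper (a sketch; `evalue` is a hypothetical function name, not part of any package used in this chapter):

```julia
# E-value for an observed risk ratio (VanderWeele and Ding 2017).
# For protective effects (RR < 1) the formula is applied to 1/RR.
function evalue(rr)
    rr <= 0 && throw(DomainError(rr, "risk ratio must be positive"))
    r = rr < 1 ? 1 / rr : rr          # orient towards a harmful effect
    return r + sqrt(r * (r - 1))
end

println("E-value for RR = 2.0: ", round(evalue(2.0), digits = 2))
println("E-value for RR = 0.5: ", round(evalue(0.5), digits = 2))  # same robustness
println("E-value for RR = 1.0: ", evalue(1.0))                     # nothing to explain away
```

A protective \(RR = 0.5\) and a harmful \(RR = 2\) yield the same E-value, and a null effect (\(RR = 1\)) gives \(E = 1\): no confounding at all is needed to "explain" no effect.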

22.9 Rosenbaum Bounds

Rosenbaum bounds provide a sensitivity analysis framework for matched observational studies (Rosenbaum 2002). When units are matched on observed covariates, the key question is: at what level of unmeasured confounding does the conclusion break?

22.9.1 The Γ Parameter

In a matched design, we assume that within matched pairs, treatment assignment is “as if random” given the matching variables. The sensitivity parameter Γ quantifies departure from this assumption:

  • Γ = 1: No unmeasured confounding; treatment assignment is random within pairs
  • Γ > 1: Unmeasured confounding; the odds of treatment assignment can differ by a factor of Γ between two units matched on observables

For a given Γ, we can compute bounds on the p-value or confidence interval. As Γ increases, the bounds widen; eventually, the null hypothesis may no longer be rejected.

22.9.2 Interpretation

  • Γ = 2: One unit could be twice as likely as its matched pair to receive treatment due to unmeasured factors
  • Γ = 4: One unit could be four times as likely
  • Conclusion robustness: Report the largest Γ for which the conclusion holds (e.g., “the effect remains significant for Γ ≤ 3”)
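As a minimal sketch of how Γ enters a concrete calculation, consider a sign test on matched pairs with a binary outcome. Among the discordant pairs, the worst-case probability that a pair favours treatment under sensitivity Γ is \(p^+ = \Gamma / (1 + \Gamma)\), giving a binomial upper bound on the one-sided p-value. The counts below are hypothetical and the helper functions are illustrative, not from any package:

```julia
# Upper tail of a Binomial(n, p): P(X ≥ T), computed directly.
binom_tail(n, T, p) = sum(binomial(n, k) * p^k * (1 - p)^(n - k) for k in T:n)

# Rosenbaum upper bound on the one-sided sign-test p-value:
# worst-case per-pair probability of favouring treatment is Γ / (1 + Γ).
rosenbaum_pvalue_bound(n, T, Γ) = binom_tail(n, T, Γ / (1 + Γ))

n, T = 40, 30  # hypothetical: 40 discordant pairs, 30 favour treatment
for Γ in (1.0, 1.5, 2.0, 3.0)
    println("Γ = ", Γ, "  upper-bound p-value = ",
            round(rosenbaum_pvalue_bound(n, T, Γ), digits = 4))
end
```

At Γ = 1 the bound is the ordinary sign-test p-value; as Γ grows, the bound rises, and the largest Γ at which it stays below the significance threshold is the reported robustness.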

22.9.3 Connection to Propensity Score Methods

Rosenbaum bounds complement propensity score matching: the propensity score addresses measured confounding, whilst the Γ-sensitivity analysis addresses unmeasured confounding. Together, they provide a fuller picture of robustness in observational studies.

22.10 Manski Partial Identification

When point identification fails—when we cannot uniquely determine a causal effect from the data—partial identification asks what we can still learn (Manski 2003).

22.10.1 The Worst-Case Principle

Without parametric assumptions or strong identification conditions, we may only be able to bound the causal effect. Manski bounds exploit the fact that we observe \((Y, D)\) but not the counterfactuals \(Y(0)\) and \(Y(1)\).

22.10.2 No-Assumptions Bounds

For the mean potential outcome under treatment, with outcome \(Y \in [y_{\min}, y_{\max}]\) and treatment indicator \(D\), we observe \(\mathbb{E}[Y \mid D=1]\) but know nothing about \(Y(1)\) for the untreated beyond the support of \(Y\), giving

\[ \mathbb{E}[Y(1)] \in \left[ \mathbb{E}[Y \mid D=1] P(D=1) + y_{\min} P(D=0), \; \mathbb{E}[Y \mid D=1] P(D=1) + y_{\max} P(D=0) \right] \]

These bounds are often wide but require no assumptions beyond the support of \(Y\).
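These bounds can be computed directly from observed data. The sketch below uses hypothetical data, and `manski_bounds_Y1` is an illustrative helper, not a function from any package:

```julia
# Sketch: no-assumptions Manski bounds on E[Y(1)] with Y ∈ [y_min, y_max].
function manski_bounds_Y1(Y, D, y_min, y_max)
    p1 = sum(D) / length(D)                  # P(D = 1)
    mean_treated = sum(Y[D]) / sum(D)        # E[Y | D = 1], observed
    # Counterfactual mean for the untreated can be anywhere in [y_min, y_max]
    lower = mean_treated * p1 + y_min * (1 - p1)
    upper = mean_treated * p1 + y_max * (1 - p1)
    return (lower, upper)
end

Y = [0.8, 0.6, 0.9, 0.2, 0.4, 0.5]               # hypothetical outcomes in [0, 1]
D = [true, true, true, false, false, false]       # treatment indicator
lo, hi = manski_bounds_Y1(Y, D, 0.0, 1.0)
println("E[Y(1)] ∈ [", round(lo, digits = 3), ", ", round(hi, digits = 3), "]")
```

Note that the width of the interval is exactly \((y_{\max} - y_{\min}) P(D=0)\): the less often treatment is observed, the wider the bounds.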

22.10.3 Monotone Treatment Response

Shape constraints can tighten bounds. If we assume monotone treatment response (e.g., treatment cannot harm: \(Y(1) \geq Y(0)\)), the identified set narrows. Similar gains come from monotone instrumental variables or monotone selection.

22.10.4 Connection to Dynamical Systems

In state-space models and dynamical systems, interventional predictions often rely on latent state inference. When the latent structure is partially identified (e.g., multiple parameter sets fit the data equally well), Manski-style bounds apply: we can characterise the range of possible intervention effects rather than a single point estimate.

22.11 Sensitivity for Dynamical Systems

Sensitivity analysis in dynamical systems extends beyond parameter variation to questions of model misspecification and unmeasured confounding in causal inference from time-series data.

22.11.1 Sensitivity of Inferred Latent States

In state-space models, latent states \(X_t\) are inferred from observations \(Y_t\). Results are sensitive to:

  • Measurement model misspecification: Wrong observation equation \(h(X_t)\) biases state estimates
  • Process model misspecification: Wrong dynamics \(f(X_t, u_t)\) propagate error forward
  • Noise distribution assumptions: Non-Gaussian or correlated noise affects inference

22.11.2 Sensitivity of Interventional Predictions

When predicting intervention effects (e.g., \(do(X_t = x)\)), sensitivity arises from:

  • Unmeasured confounders: Variables affecting both treatment and outcome that are not in the model
  • Selection bias: Observed trajectories may not represent the population of interest
  • Parameter uncertainty: Even with correct structure, parameter estimates have uncertainty

22.11.3 Connection to ODE/SDE Robustness

The robustness of ODE and SDE parameters (discussed earlier in this chapter) connects to causal sensitivity: if small parameter changes dramatically alter interventional predictions, those predictions are fragile. Global sensitivity analysis (Sobol indices) identifies which parameters most affect intervention outcomes.
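As a minimal illustration of parameter sensitivity of an interventional prediction, the sketch below uses a discrete logistic map with a hypothetical intervention that clamps the state at one time step, and estimates the sensitivity of the final state to the growth parameter \(r\) by central finite differences (a stand-in for the adjoint and Sobol methods discussed above):

```julia
# Sketch: sensitivity of a post-intervention prediction to a parameter.
# Discrete logistic dynamics; the intervention do(X_t = x_do) is applied
# at step t_do, then the system evolves freely until step T.
function final_state(r; x0 = 0.1, x_do = 0.5, t_do = 10, T = 50)
    x = x0
    for t in 1:T
        x = (t == t_do) ? x_do : r * x * (1 - x)
    end
    return x
end

r0, h = 2.5, 1e-6
# Central finite-difference estimate of ∂(final state)/∂r
sens = (final_state(r0 + h) - final_state(r0 - h)) / (2h)
println("∂(final state)/∂r ≈ ", round(sens, digits = 4))
```

Here the dynamics contract towards the stable fixed point \(x^* = 1 - 1/r\), so the long-run sensitivity approaches \(dx^*/dr = 1/r^2 = 0.16\); a large value would instead flag a fragile interventional prediction.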

22.11.4 Implementation: E-Value for Observational Causal Effect

The following code computes an E-value for an observed risk ratio from an observational study:

"""
Compute the E-value for sensitivity to unmeasured confounding.
E-value: minimum strength of unmeasured confounder needed to explain away the effect.
Formula: E = RR + √(RR · (RR - 1)) for RR ≥ 1 (harmful effect)
For protective effects (RR < 1), use 1/RR.
"""
function e_value(RR::Real)
    RR >= 1 || error("For RR < 1 (protective effect), pass 1/RR instead")
    return RR + sqrt(RR * (RR - 1))
end

# Example: observational study finds drug increases risk, RR = 2.5
RR_observed = 2.5
E = e_value(RR_observed)
println("Observed risk ratio: ", RR_observed)
println("E-value: ", round(E, digits=2))
println("Interpretation: Unmeasured confounder would need RR ≥ ", round(E, digits=2),
        " with both treatment and outcome to explain away this effect")

# Example: protective effect RR = 0.4 (use 1/0.4 = 2.5)
RR_protective = 0.4
E_protective = e_value(1 / RR_protective)
println("\nProtective effect RR = ", RR_protective, " → E-value = ", round(E_protective, digits=2))
Observed risk ratio: 2.5
E-value: 4.44
Interpretation: Unmeasured confounder would need RR ≥ 4.44 with both treatment and outcome to explain away this effect

Protective effect RR = 0.4 → E-value = 4.44

22.12 World Context

This chapter addresses Imagining in the Dynamical world—what alternative dynamic behaviours are possible under different assumptions? Sensitivity and robustness analysis explore how system behaviour changes with parameter variations, structural changes, or different initial conditions.

22.13 Key Takeaways

  1. Parameter sensitivity: How dynamics change with parameter variations
  2. Initial condition sensitivity: How dynamics change with different initial conditions
  3. Structural sensitivity: How dynamics change with structural perturbations
  4. Robustness metrics: Measures of system stability and resilience
  5. Interventions affect robustness: Interventions modify system robustness properties
  6. E-values: Simple, interpretable measure of sensitivity to unmeasured confounding in observational studies
  7. Rosenbaum bounds: Sensitivity analysis for matched studies via the Γ parameter
  8. Manski partial identification: Bounds on causal effects when point identification fails
  9. Dynamical sensitivity: Latent state inference and interventional predictions are sensitive to model misspecification

22.14 Further Reading

  • VanderWeele and Ding (2017): E-values for sensitivity to unmeasured confounding
  • Rosenbaum (2002): Rosenbaum bounds and sensitivity analysis for matched studies
  • Manski (2003): Partial identification and bounds on causal effects
  • Strogatz (2014): Nonlinear Dynamics and Chaos
  • Khalil (2002): Nonlinear Systems
  • Saltelli et al. (2008): Global Sensitivity Analysis: The Primer — comprehensive treatment of Sobol indices and variance-based methods
  • Rackauckas et al. (2020): Universal differential equations and adjoint methods for efficient gradient computation
  • Rackauckas (2026): Parallel Computing and Scientific Machine Learning — comprehensive treatment of adjoint methods, global sensitivity analysis, and uncertainty quantification
  • SciMLSensitivity.jl: Adjoint sensitivity methods (https://sensitivity.sciml.ai/stable/)
  • GlobalSensitivity.jl: Sobol indices and global sensitivity analysis (https://github.com/SciML/GlobalSensitivity.jl)
  • Measurements.jl: Automatic uncertainty propagation (https://github.com/JuliaPhysics/Measurements.jl)
  • Advanced Dynamics: Regimes, Networks, and Resilience: Resilience and robustness
  • From Dynamical to Observable: Measurement and Actualisation: Transition to Observable world