diff --git a/dev/.documenter-siteinfo.json b/dev/.documenter-siteinfo.json index c927069..b56d78f 100644 --- a/dev/.documenter-siteinfo.json +++ b/dev/.documenter-siteinfo.json @@ -1 +1 @@ -{"documenter":{"julia_version":"1.9.4","generation_timestamp":"2024-02-05T02:16:32","documenter_version":"1.2.1"}} \ No newline at end of file +{"documenter":{"julia_version":"1.9.4","generation_timestamp":"2024-02-06T06:59:01","documenter_version":"1.2.1"}} \ No newline at end of file diff --git a/dev/api/basic_queries/index.html b/dev/api/basic_queries/index.html index fc7a57b..e156c27 100644 --- a/dev/api/basic_queries/index.html +++ b/dev/api/basic_queries/index.html @@ -1,4 +1,4 @@ -Basic Queries · EasyModelAnalysis.jl

Basic Queries

Timeseries

Extrema

EasyModelAnalysis.get_min_tFunction
get_min_t(prob, sym)
-get_min_t(sol, sym)

Returns (t,min) where t is the timepoint where sym reaches its minimum min in the interval prob.tspan.

source
EasyModelAnalysis.get_max_tFunction
get_max_t(prob, sym)
-get_max_t(sol, sym)

Returns (t,max) where t is the timepoint where sym reaches its maximum max in the interval prob.tspan.

source
EasyModelAnalysis.plot_extremaFunction
plot_extrema(prob, sym)

Plots the solution of the observable sym and highlights the time points where it attains its maximum and minimum values.

source
EasyModelAnalysis.phaseplot_extremaFunction
phaseplot_extrema(prob, sym, plotsyms)

Plots the phase-plot trajectory of the observable sym and highlights the time points where it attains its maximum and minimum values. plotsyms should be given as the tuple of symbols for the observables that define the axes of the phase plot.

source

Forecasting

EasyModelAnalysis.get_uncertainty_forecastFunction
get_uncertainty_forecast(prob, sym, t, uncertainp, samples)

Get the ensemble of time series of the state sym evaluated at times t for solutions with uncertain parameters specified according to the distributions in uncertainp. The distributions are specified in the form [sym1 => dist1, sym2 => dist2] where each dist is a Distributions.jl distribution. samples is the number of trajectories to run.

source
EasyModelAnalysis.get_uncertainty_forecast_quantilesFunction

get_uncertainty_forecast_quantiles(prob, sym, t, uncertainp, samples, quants = (0.05, 0.95))

Get the ensemble of time series of the state sym evaluated at times t for solutions with uncertain parameters specified according to the distributions in uncertainp. The distributions are specified in the form [sym1 => dist1, sym2 => dist2] where each dist is a Distributions.jl distribution. samples is the number of trajectories to run.

Returns a tuple of arrays for the quantiles quants, which defaults to (0.05, 0.95), i.e. the 5th and 95th percentiles.

source
+Basic Queries · EasyModelAnalysis.jl

Basic Queries

Timeseries

Extrema

EasyModelAnalysis.get_min_tFunction
get_min_t(prob, sym)
+get_min_t(sol, sym)

Returns (t,min) where t is the timepoint where sym reaches its minimum min in the interval prob.tspan.

source
EasyModelAnalysis.get_max_tFunction
get_max_t(prob, sym)
+get_max_t(sol, sym)

Returns (t,max) where t is the timepoint where sym reaches its maximum max in the interval prob.tspan.

source
EasyModelAnalysis.plot_extremaFunction
plot_extrema(prob, sym)

Plots the solution of the observable sym and highlights the time points where it attains its maximum and minimum values.

source
EasyModelAnalysis.phaseplot_extremaFunction
phaseplot_extrema(prob, sym, plotsyms)

Plots the phase-plot trajectory of the observable sym and highlights the time points where it attains its maximum and minimum values. plotsyms should be given as the tuple of symbols for the observables that define the axes of the phase plot.

source
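As a quick illustration of how these extrema queries are typically called, here is a minimal sketch with a made-up toy model (not taken from the package docs; EasyModelAnalysis re-exports the ModelingToolkit interface used below):

using EasyModelAnalysis
@variables t x(t)
D = Differential(t)
@named decay = ODESystem([D(x) ~ -0.5x + sin(t)])  # hypothetical forced-decay model
prob = ODEProblem(structural_simplify(decay), [x => 1.0], (0.0, 20.0))
tmin, xminval = get_min_t(prob, x)   # time and value of the minimum of x over prob.tspan
tmax, xmaxval = get_max_t(prob, x)   # time and value of the maximum of x over prob.tspan
plot_extrema(prob, x)                # same trajectory with both extrema highlighted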

Forecasting

EasyModelAnalysis.get_uncertainty_forecastFunction
get_uncertainty_forecast(prob, sym, t, uncertainp, samples)

Get the ensemble of time series of the state sym evaluated at times t for solutions with uncertain parameters specified according to the distributions in uncertainp. The distributions are specified in the form [sym1 => dist1, sym2 => dist2] where each dist is a Distributions.jl distribution. samples is the number of trajectories to run.

source
EasyModelAnalysis.get_uncertainty_forecast_quantilesFunction

get_uncertainty_forecast_quantiles(prob, sym, t, uncertainp, samples, quants = (0.05, 0.95))

Get the ensemble of time series of the state sym evaluated at times t for solutions with uncertain parameters specified according to the distributions in uncertainp. The distributions are specified in the form [sym1 => dist1, sym2 => dist2] where each dist is a Distributions.jl distribution. samples is the number of trajectories to run.

Returns a tuple of arrays for the quantiles quants, which defaults to (0.05, 0.95), i.e. the 5th and 95th percentiles.

source
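A rough sketch of how these two functions are combined in practice (here prob, x, and p are placeholders for an existing ODEProblem, an observable, and an uncertain parameter; the save times and sample count are illustrative):

using EasyModelAnalysis, Distributions
trajs = get_uncertainty_forecast(prob, [x], 0:1.0:100, [p => Uniform(0.0, 1.0)], 500)
qs = get_uncertainty_forecast_quantiles(prob, [x], 0:1.0:100, [p => Uniform(0.0, 1.0)], 500,
    (0.05, 0.95))
# qs[1] and qs[2] hold the lower and upper quantile curves; plot_uncertainty_forecast and
# plot_uncertainty_forecast_quantiles accept the same arguments and plot the result directly.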
diff --git a/dev/api/data_fitting_calibration/index.html b/dev/api/data_fitting_calibration/index.html index 71638d4..260ea16 100644 --- a/dev/api/data_fitting_calibration/index.html +++ b/dev/api/data_fitting_calibration/index.html @@ -7,18 +7,18 @@ ]

then if datafit(prob, p, t, data), t must be length 3 and these values correspond to x(t[i]).

If datafit(prob, p, data), then the data must be a tuple of (t, timeseries), for example:

[
 x => ([1.0, 2.0, 3.0], [11.352378507900013, 11.818374125301172, -10.72999081810307])
 z => ([0.5, 1.5, 2.5, 3.5], [2.005502877055581, 13.626953144513832, 5.382984515620634, 12.232084518374545])
-]

where this means x(2.0) == 11.81...

source
EasyModelAnalysis.global_datafitFunction
global_datafit(prob, pbounds, t, data; maxiters = 10000)
+]

where this means x(2.0) == 11.81...

source
EasyModelAnalysis.global_datafitFunction
global_datafit(prob, pbounds, t, data; maxiters = 10000)
 global_datafit(prob, pbounds, data; maxiters = 10000)

Fit parameters p to data measured at times t.

Arguments

  • prob: ODEProblem
  • pbounds: Vector of pairs of symbolic parameters to vectors of lower and upper bounds for the parameters.
  • t: Vector of time-points
  • data: Vector of pairs of symbolic states and measurements of these states at times t.

Keyword Arguments

  • maxiters: how long to run the optimization for. Defaults to 10000. Larger values are slower but more robust.
  • loss: the loss function used for fitting. Defaults to EasyModelAnalysis.l2loss, with an alternative being EasyModelAnalysis.relative_l2loss for relative weighted error.

p does not have to contain all the parameters required to solve prob, it can be a subset of parameters. Other parameters necessary to solve prob default to the parameter values found in prob.p. Similarly, not all states must be measured.

Data Definition

The data definition is given as a vector of pairs. If t is specified globally for the datafit, then those time series correspond to the time points specified. For example,

[
 x => [11.352378507900013, 11.818374125301172, -10.72999081810307]
 z => [2.005502877055581, 13.626953144513832, 5.382984515620634, 12.232084518374545]
 ]

then if datafit(prob, p, t, data), t must be length 3 and these values correspond to x(t[i]).

If datafit(prob, p, data), then the data must be a tuple of (t, timeseries), for example:

[
 x => ([1.0, 2.0, 3.0], [11.352378507900013, 11.818374125301172, -10.72999081810307])
 z => ([0.5, 1.5, 2.5, 3.5], [2.005502877055581, 13.626953144513832, 5.382984515620634, 12.232084518374545])
-]

where this means x(2.0) == 11.81...

source
EasyModelAnalysis.bayesian_datafitFunction
bayesian_datafit(prob, p, t, data)
+]

where this means x(2.0) == 11.81...

source
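A compact sketch of a bounded fit follows; the toy model, bounds, and measurements here are invented purely for illustration:

using EasyModelAnalysis
@parameters a b
@variables t x(t)
D = Differential(t)
@named sys = ODESystem([D(x) ~ a * x - b * x^2])   # hypothetical logistic-style model
prob = ODEProblem(structural_simplify(sys), [x => 0.1], (0.0, 5.0), [a => 1.0, b => 1.0])
tdata = [1.0, 2.0, 3.0]
data = [x => [0.2, 0.45, 0.7]]                      # made-up measurements of x at tdata
pbounds = [a => [0.5, 2.0], b => [0.5, 2.0]]        # lower/upper bound per fitted parameter
fit = global_datafit(prob, pbounds, tdata, data)    # Vector of parameter => fitted value
# Only a subset of parameters needs to be fitted; the others keep their values from prob.p.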
EasyModelAnalysis.bayesian_datafitFunction
bayesian_datafit(prob, p, t, data)
 bayesian_datafit(prob, p, data)

Calculate posterior distribution for parameters p given data measured at times t.

Data Definition

The data definition is given as a vector of pairs. If t is specified globally for the datafit, then those time series correspond to the time points specified. For example,

[
 x => [11.352378507900013, 11.818374125301172, -10.72999081810307]
 z => [2.005502877055581, 13.626953144513832, 5.382984515620634, 12.232084518374545]
 ]

then if datafit(prob, p, t, data), t must be length 3 and these values correspond to x(t[i]).

If datafit(prob, p, data), then the data must be a tuple of (t, timeseries), for example:

[
 x => ([1.0, 2.0, 3.0], [11.352378507900013, 11.818374125301172, -10.72999081810307])
 z => ([0.5, 1.5, 2.5, 3.5], [2.005502877055581, 13.626953144513832, 5.382984515620634, 12.232084518374545])
-]

where this means x(2.0) == 11.81...

source
+]

where this means x(2.0) == 11.81...

source diff --git a/dev/api/sensitivity_analysis/index.html b/dev/api/sensitivity_analysis/index.html index d80b748..809578d 100644 --- a/dev/api/sensitivity_analysis/index.html +++ b/dev/api/sensitivity_analysis/index.html @@ -1,2 +1,2 @@ -Sensitivity Analysis · EasyModelAnalysis.jl

Sensitivity Analysis

EasyModelAnalysis.get_sensitivityFunction
get_sensitivity(prob, t, x, pbounds)

Returns the Sobol indices that quantify the sensitivity of the solution at time t and observation x to the parameters in pbounds.

Arguments

  • t: The time of observation, the solution is stored at this time to obtain the relevant observed variable.
  • x: The observation symbolic expression or a function that acts on the solution object.
  • pbounds: An array with the bounds for each parameter, passed as a pair of parameter expression and a vector with the upper and lower bound.

Keyword Arguments

  • samples: Number of samples for running the global sensitivity analysis.

Returns

  • A dictionary with the first, second and total order indices for all parameters (and pairs in case of second order).
source
EasyModelAnalysis.create_sensitivity_plotFunction
create_sensitivity_plot(prob, t, x, pbounds)

Creates bar plots of the first, second and total order Sobol indices that quantify sensitivity of the solution at time t and state x to the parameters in pbounds.

See also get_sensitivity

source
create_sensitivity_plot(sensres, pbounds, total_only = false; kw...)

Creates bar plots of the first, second and total order Sobol indices from the result of get_sensitivity and pbounds.

See also get_sensitivity

source
+Sensitivity Analysis · EasyModelAnalysis.jl

Sensitivity Analysis

EasyModelAnalysis.get_sensitivityFunction
get_sensitivity(prob, t, x, pbounds)

Returns the Sobol indices that quantify the sensitivity of the solution at time t and observation x to the parameters in pbounds.

Arguments

  • t: The time of observation, the solution is stored at this time to obtain the relevant observed variable.
  • x: The observation symbolic expression or a function that acts on the solution object.
  • pbounds: An array with the bounds for each parameter, passed as a pair of parameter expression and a vector with the upper and lower bound.

Keyword Arguments

  • samples: Number of samples for running the global sensitivity analysis.

Returns

  • A dictionary with the first, second and total order indices for all parameters (and pairs in case of second order).
source
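For instance, with an existing ODEProblem prob whose system has parameters a and b and an observable x (all placeholder names), a call might look like the following sketch:

pbounds = [a => [0.1, 1.0], b => [0.5, 2.0]]
sensres = get_sensitivity(prob, 50.0, x, pbounds; samples = 1000)
# sensres is a dictionary with entries such as :a_first_order, :b_total_order and
# :a_b_second_order, matching the output shown in the sensitivity analysis tutorial.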
EasyModelAnalysis.create_sensitivity_plotFunction
create_sensitivity_plot(prob, t, x, pbounds)

Creates bar plots of the first, second and total order Sobol indices that quantify sensitivity of the solution at time t and state x to the parameters in pbounds.

See also get_sensitivity

source
create_sensitivity_plot(sensres, pbounds, total_only = false; kw...)

Creates bar plots of the first, second and total order Sobol indices from the result of get_sensitivity and pbounds.

See also get_sensitivity

source
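When the indices have already been computed, the plotting call can reuse them. For example, continuing the placeholder names from the get_sensitivity sketch above, with total_only = true to show only the total-order bars:

create_sensitivity_plot(sensres, pbounds, true)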
diff --git a/dev/api/threshold_interventions/index.html b/dev/api/threshold_interventions/index.html index 5be9776..595146a 100644 --- a/dev/api/threshold_interventions/index.html +++ b/dev/api/threshold_interventions/index.html @@ -1,3 +1,3 @@ -Threshold Interventions · EasyModelAnalysis.jl

Threshold Interventions

EasyModelAnalysis.optimal_threshold_interventionFunction
optimal_threshold_intervention(prob, [p1 = prob.p], p2, obs, threshold, duration; maxtime)

Arguments

  • p1: parameters for the pre-intervention scenario. Defaults to prob.p.
  • p2: parameters for the post-intervention scenario.
  • obs: The observation symbolic expression.
  • threshold: The threshold for the observation.
  • duration: Duration for the evaluation of intervention.

Keyword Arguments

  • maxtime: Maximum optimization time. Defaults to 60.

Returns

  • opt_tspan: Optimal intervention time span.
  • (s1, s2, s3): Pre-intervention, intervention, post-intervention solutions.
  • ret: Return code from the optimization.
source
EasyModelAnalysis.optimal_parameter_intervention_for_thresholdFunction
optimal_parameter_intervention_for_threshold(prob, obs, threshold, cost, ps,
-    lb, ub, intervention_tspan, duration; ineq_cons = nothing, maxtime=60)

Arguments

  • prob: An ODEProblem.
  • obs: The observation symbolic expression.
  • threshold: The threshold for the observation.
  • cost: the cost function for minimization, e.g. α + 20 * β.
  • ps: the parameters that appear in the cost, e.g. [α, β].
  • lb: the lower bounds of the parameters e.g. [-10, -5].
  • ub: the upper bounds of the parameters e.g. [5, 10].
  • intervention_tspan: intervention time span, e.g. (20.0, 30.0). Defaults to prob.tspan.
  • duration: Duration for the evaluation of intervention. Defaults to prob.tspan[2] - prob.tspan[1].

Keyword Arguments

  • maxtime: Maximum optimization time. Defaults to 60.
  • ineq_cons: a vector of symbolic expressions in terms of symbolic parameters. The optimizer will enforce ineq_cons .< 0.

Returns

  • opt_p: Optimal intervention parameters.
  • (s1, s2, s3): Pre-intervention, intervention, post-intervention solutions.
  • ret: Return code from the optimization.
source
+Threshold Interventions · EasyModelAnalysis.jl

Threshold Interventions

EasyModelAnalysis.optimal_threshold_interventionFunction
optimal_threshold_intervention(prob, [p1 = prob.p], p2, obs, threshold, duration; maxtime)

Arguments

  • p1: parameters for the pre-intervention scenario. Defaults to prob.p.
  • p2: parameters for the post-intervention scenario.
  • obs: The observation symbolic expression.
  • threshold: The threshold for the observation.
  • duration: Duration for the evaluation of intervention.

Keyword Arguments

  • maxtime: Maximum optimization time. Defaults to 60.

Returns

  • opt_tspan: Optimal intervention time span.
  • (s1, s2, s3): Pre-intervention, intervention, post-intervention solutions.
  • ret: Return code from the optimization.
source
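A sketch of a typical call, using placeholder intervention parameters and threshold values (none of these numbers come from the docstring):

# p_intervention is a hypothetical parameter set representing the intervention (e.g. a
# lowered transmission rate); I is the observable to keep below 3000 over 120 time units.
opt_tspan, (s1, s2, s3), ret = optimal_threshold_intervention(prob, p_intervention, I,
    3000.0, 120.0; maxtime = 60)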
EasyModelAnalysis.optimal_parameter_intervention_for_thresholdFunction
optimal_parameter_intervention_for_threshold(prob, obs, threshold, cost, ps,
+    lb, ub, intervention_tspan, duration; ineq_cons = nothing, maxtime=60)

Arguments

  • prob: An ODEProblem.
  • obs: The observation symbolic expression.
  • threshold: The threshold for the observation.
  • cost: the cost function for minimization, e.g. α + 20 * β.
  • ps: the parameters that appear in the cost, e.g. [α, β].
  • lb: the lower bounds of the parameters e.g. [-10, -5].
  • ub: the upper bounds of the parameters e.g. [5, 10].
  • intervention_tspan: intervention time span, e.g. (20.0, 30.0). Defaults to prob.tspan.
  • duration: Duration for the evaluation of intervention. Defaults to prob.tspan[2] - prob.tspan[1].

Keyword Arguments

  • maxtime: Maximum optimization time. Defaults to 60.
  • ineq_cons: a vector of symbolic expressions in terms of symbolic parameters. The optimizer will enforce ineq_cons .< 0.

Returns

  • opt_p: Optimal intervention parameters.
  • (s1, s2, s3): Pre-intervention, intervention, post-intervention solutions.
  • ret: Return code from the optimization.
source
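A sketch of how the positional arguments line up, reusing the α/β cost example from the argument list above (the observable, threshold, and duration are placeholders):

opt_p, (s1, s2, s3), ret = optimal_parameter_intervention_for_threshold(prob, I, 3000.0,
    α + 20 * β, [α, β], [-10.0, -5.0], [5.0, 10.0], (20.0, 30.0), 100.0; maxtime = 60)
# opt_p maps each intervention parameter (α, β) to its optimized value.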
diff --git a/dev/assets/Manifest.toml b/dev/assets/Manifest.toml index 352e572..d6af356 100644 --- a/dev/assets/Manifest.toml +++ b/dev/assets/Manifest.toml @@ -1869,9 +1869,9 @@ version = "0.5.5+0" [[deps.Optim]] deps = ["Compat", "FillArrays", "ForwardDiff", "LineSearches", "LinearAlgebra", "MathOptInterface", "NLSolversBase", "NaNMath", "Parameters", "PositiveFactorizations", "Printf", "SparseArrays", "StatsBase"] -git-tree-sha1 = "47fea72de134f75b105a5d4a1abe5c6aec89d390" +git-tree-sha1 = "d024bfb56144d947d4fafcd9cb5cafbe3410b133" uuid = "429524aa-4258-5aef-a3af-852621145aeb" -version = "1.9.1" +version = "1.9.2" [[deps.Optimisers]] deps = ["ChainRulesCore", "Functors", "LinearAlgebra", "Random", "Statistics"] @@ -2257,9 +2257,9 @@ version = "0.4.0+0" [[deps.Roots]] deps = ["Accessors", "ChainRulesCore", "CommonSolve", "Printf"] -git-tree-sha1 = "39ebae5b76c8cd5629bec21adfca78b437dac1e6" +git-tree-sha1 = "754acd3031a9f2eaf6632ba4850b1c01fe4460c1" uuid = "f2b01f46-fcfa-551c-844a-d8ac1e96c665" -version = "2.1.1" +version = "2.1.2" [deps.Roots.extensions] RootsForwardDiffExt = "ForwardDiff" @@ -2474,9 +2474,9 @@ version = "0.1.15" [[deps.Static]] deps = ["IfElse"] -git-tree-sha1 = "f295e0a1da4ca425659c57441bcb59abb035a4bc" +git-tree-sha1 = "b366eb1eb68075745777d80861c6706c33f588ae" uuid = "aedffcd0-7271-4cad-89d0-dc628f76c6d3" -version = "0.8.8" +version = "0.8.9" [[deps.StaticArrayInterface]] deps = ["ArrayInterface", "Compat", "IfElse", "LinearAlgebra", "PrecompileTools", "Requires", "SparseArrays", "Static", "SuiteSparse"] diff --git a/dev/examples/ASIR/0bccfb07.svg b/dev/examples/ASIR/8d9ca941.svg similarity index 95% rename from dev/examples/ASIR/0bccfb07.svg rename to dev/examples/ASIR/8d9ca941.svg index 2eb0efc..c10ebb2 100644 --- a/dev/examples/ASIR/0bccfb07.svg +++ b/dev/examples/ASIR/8d9ca941.svg @@ -1,56 +1,56 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/examples/ASIR/index.html b/dev/examples/ASIR/index.html index 4383986..0a6bc79 100644 --- a/dev/examples/ASIR/index.html +++ b/dev/examples/ASIR/index.html @@ -13,4 +13,4 @@ @named asir = ODESystem(eqs) prob = ODEProblem(asir, [], (0, 110.0)) sol = solve(prob) -plot(sol)Example block output +plot(sol)Example block output diff --git a/dev/examples/Carcione2020/58fb698d.svg b/dev/examples/Carcione2020/bc9d5ab7.svg similarity index 85% rename from dev/examples/Carcione2020/58fb698d.svg rename to dev/examples/Carcione2020/bc9d5ab7.svg index 97b567f..299072a 100644 --- a/dev/examples/Carcione2020/58fb698d.svg +++ b/dev/examples/Carcione2020/bc9d5ab7.svg @@ -1,174 +1,174 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/examples/Carcione2020/08fe10bf.svg b/dev/examples/Carcione2020/d8e58008.svg similarity index 89% rename from dev/examples/Carcione2020/08fe10bf.svg rename to dev/examples/Carcione2020/d8e58008.svg index 
cb4f044..6637b8d 100644 --- a/dev/examples/Carcione2020/08fe10bf.svg +++ b/dev/examples/Carcione2020/d8e58008.svg @@ -1,46 +1,46 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/examples/Carcione2020/index.html b/dev/examples/Carcione2020/index.html index 2528b74..e082a24 100644 --- a/dev/examples/Carcione2020/index.html +++ b/dev/examples/Carcione2020/index.html @@ -33,11 +33,11 @@ tspan = (0.0, 1.0) prob = ODEProblem(odesys, [], tspan, []) sol = solve(prob, Rodas5()) -plot(sol, idxs = Deceased)Example block output
pbounds = [
+plot(sol, idxs = Deceased)
Example block output
pbounds = [
     alpha => [0.003, 0.006],
     epsilon => [1 / 6, 1 / 2],
     gamma => [0.1, 0.2],
     mu => [0.01, 0.02],
     beta => [0.7, 0.9],
 ]
-create_sensitivity_plot(prob, 100.0, Deceased, pbounds; samples = 2000)
Example block output +create_sensitivity_plot(prob, 100.0, Deceased, pbounds; samples = 2000)Example block output diff --git a/dev/examples/SEIRHD/27c5df44.svg b/dev/examples/SEIRHD/27c5df44.svg deleted file mode 100644 index 4b0fec7..0000000 --- a/dev/examples/SEIRHD/27c5df44.svg +++ /dev/null @@ -1,44 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/dev/examples/SEIRHD/4d3fc6d0.svg b/dev/examples/SEIRHD/4d3fc6d0.svg new file mode 100644 index 0000000..6ff44a6 --- /dev/null +++ b/dev/examples/SEIRHD/4d3fc6d0.svg @@ -0,0 +1,44 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/examples/SEIRHD/5c422ab7.svg b/dev/examples/SEIRHD/cbff9347.svg similarity index 95% rename from dev/examples/SEIRHD/5c422ab7.svg rename to dev/examples/SEIRHD/cbff9347.svg index 65f99af..908f00d 100644 --- a/dev/examples/SEIRHD/5c422ab7.svg +++ b/dev/examples/SEIRHD/cbff9347.svg @@ -1,58 +1,58 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/examples/SEIRHD/index.html b/dev/examples/SEIRHD/index.html index 22aee46..2bfb312 100644 --- a/dev/examples/SEIRHD/index.html +++ b/dev/examples/SEIRHD/index.html @@ -18,28 +18,28 @@ seirhd = structural_simplify(seirhd) prob = ODEProblem(seirhd, [], (0, 110.0)) sol = solve(prob) -plot(sol)Example block output

Let's solve a few problems:

Provide a forecast of cumulative Covid-19 cases and deaths over the 6-week period from May 1 – June 15, 2020 under no interventions, including 90% prediction intervals in your forecasts. Compare the accuracy of the forecasts with true data over the six-week timespan.

get_uncertainty_forecast(prob, [cumulative_I], 0:100, [β₁ => Uniform(0.0, 1.0)], 6 * 7)
42-element Vector{Matrix{Float64}}:
- [0.0 0.009566741802214083 … 2.0136452654587336 2.0344873648765027]
- [0.0 0.009558646817621011 … 0.5191931032775697 0.5273668587777561]
- [0.0 0.009561468585545702 … 1.7635567201098303 1.7851277689673166]
- [0.0 0.009563059798818592 … 1.8963932570982136 1.917583652731696]
- [0.0 0.009570052750143096 … 2.0571769537098286 2.077889112880953]
- [0.0 0.009568431515465076 … 2.03933959160985 2.0601050069596125]
- [0.0 0.0095660462681354 … 1.999889960265334 2.0207730533421078]
- [0.0 0.009563664477594517 … 1.9264673125415095 1.947568780302643]
- [0.0 0.00955881917065538 … 0.6472010316336315 0.6587990685068493]
- [0.0 0.00956289618948655 … 1.8869191330971997 1.9081374460013514]
+plot(sol)
Example block output

Let's solve a few problems:

Provide a forecast of cumulative Covid-19 cases and deaths over the 6-week period from May 1 – June 15, 2020 under no interventions, including 90% prediction intervals in your forecasts. Compare the accuracy of the forecasts with true data over the six-week timespan.

get_uncertainty_forecast(prob, [cumulative_I], 0:100, [β₁ => Uniform(0.0, 1.0)], 6 * 7)
42-element Vector{Matrix{Float64}}:
+ [0.0 0.009560002554083073 … 1.433431391360124 1.455397676117388]
+ [0.0 0.009574873643125216 … 2.0899492204579935 2.1105634720064894]
+ [0.0 0.009573702672718678 … 2.0838344280374757 2.104466955405496]
+ [0.0 0.009571450434087206 … 2.0690945765948325 2.0897711408768385]
+ [0.0 0.009570376514822518 … 2.0601773880514385 2.0808805903287757]
+ [0.0 0.0095649011772127 … 1.9710772104733612 1.9920461537103233]
+ [0.0 0.009564371898726423 … 1.9541681867073502 1.9751874113270378]
+ [0.0 0.00955835267173154 … 0.35407197985553374 0.35778635956446647]
+ [0.0 0.009558263972282878 … 0.31811959063338335 0.32095066186369875]
+ [0.0 0.009568413510554603 … 2.0391101344900604 2.0598762343458783]
  ⋮
- [0.0 0.009572298354007207 … 2.0751864973583305 2.095844861590391]
- [0.0 0.009563605983305298 … 1.9238525517949534 1.944961766095559]
- [0.0 0.00957511435560049 … 2.091103116294189 2.1117139190501737]
- [0.0 0.00956576372785984 … 1.9935861767161782 2.0144880488555468]
- [0.0 0.009563534033207574 … 1.9205576030178977 1.941676575746298]
- [0.0 0.009568182169269803 … 2.0360891655661995 2.0568642774430153]
- [0.0 0.009577263723001272 … 2.1001368050540505 2.1207206030844215]
- [0.0 0.009572707432744583 … 2.077875592253804 2.0985259270923957]
- [0.0 0.009558831169572889 … 0.6567481876745785 0.6685897778377957]
plot_uncertainty_forecast(prob, [cumulative_I], 0:100, [β₁ => Uniform(0.0, 1.0)], 6 * 7)
get_uncertainty_forecast_quantiles(prob, [cumulative_I], 0:100, [β₁ => Uniform(0.0, 1.0)],
+ [0.0 0.009559809637274482 … 1.3491872631072357 1.3709588132727006]
+ [0.0 0.009577265110743825 … 2.100141991896574 2.1207257744187076]
+ [0.0 0.009570854355095802 … 2.064329165778125 2.0850199680091603]
+ [0.0 0.00956057334761673 … 1.6093721845391198 1.631290909765734]
+ [0.0 0.00957008807697862 … 2.057512168264181 2.0782233266084162]
+ [0.0 0.009573795802131406 … 2.0843537501698433 2.1049847246363527]
+ [0.0 0.009569392484193697 … 2.0505280123287712 2.0712600223266664]
+ [0.0 0.009562391166106021 … 1.852978895883284 1.8742966522670148]
+ [0.0 0.009573002609344208 … 2.079725111105425 2.1003699163325207]
plot_uncertainty_forecast(prob, [cumulative_I], 0:100, [β₁ => Uniform(0.0, 1.0)], 6 * 7)
get_uncertainty_forecast_quantiles(prob, [cumulative_I], 0:100, [β₁ => Uniform(0.0, 1.0)],
     6 * 7)
2-element Vector{Matrix{Float64}}:
- [0.0; 0.009559924557343148; … ; 1.4011112281971059; 1.4230211248960776;;]
- [0.0; 0.009577192281560298; … ; 2.09986804759008; 2.1204526491771563;;]
plot_uncertainty_forecast_quantiles(prob, [cumulative_I], 0:100, [β₁ => Uniform(0.0, 1.0)],
-    6 * 7)
Example block output

Based on the forecasts, do we need additional interventions to keep cumulative Covid deaths under 6000 total? Provide a probability that the cumulative number of Covid deaths will stay under 6000 for the next 6 weeks without any additional interventions.

+ [0.0; 0.009558996475327854; … ; 0.7924860422985878; 0.8074897171337921;;] + [0.0; 0.009576522399725996; … ; 2.097254456278247; 2.117846872310459;;]
plot_uncertainty_forecast_quantiles(prob, [cumulative_I], 0:100, [β₁ => Uniform(0.0, 1.0)],
+    6 * 7)
Example block output

Based on the forecasts, do we need additional interventions to keep cumulative Covid deaths under 6000 total? Provide a probability that the cumulative number of Covid deaths will stay under 6000 for the next 6 weeks without any additional interventions.

diff --git a/dev/examples/petri/4a2950e6.svg b/dev/examples/petri/44606d85.svg similarity index 92% rename from dev/examples/petri/4a2950e6.svg rename to dev/examples/petri/44606d85.svg index b830a02..d5f24cb 100644 --- a/dev/examples/petri/4a2950e6.svg +++ b/dev/examples/petri/44606d85.svg @@ -1,54 +1,54 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/examples/petri/index.html b/dev/examples/petri/index.html index 5d88660..5947962 100644 --- a/dev/examples/petri/index.html +++ b/dev/examples/petri/index.html @@ -172,4 +172,4 @@ sol = solve(prob) tmax, imax = get_max_t(prob, I) plt = plot(sol) -scatter!(plt, [tmax], [imax], lab = "Maximum infection", leg = :topright)Example block output +scatter!(plt, [tmax], [imax], lab = "Maximum infection", leg = :topright)Example block output diff --git a/dev/getting_started/71f8276d.svg b/dev/getting_started/71f8276d.svg deleted file mode 100644 index 9bfd7db..0000000 --- a/dev/getting_started/71f8276d.svg +++ /dev/null @@ -1,50 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/dev/getting_started/8ebb1ef9.svg b/dev/getting_started/8ebb1ef9.svg new file mode 100644 index 0000000..d01e925 --- /dev/null +++ b/dev/getting_started/8ebb1ef9.svg @@ -0,0 +1,50 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/getting_started/078a684d.svg b/dev/getting_started/a19bec32.svg similarity index 98% rename from dev/getting_started/078a684d.svg rename to dev/getting_started/a19bec32.svg index 12d269c..cfbbf9d 100644 --- a/dev/getting_started/078a684d.svg +++ b/dev/getting_started/a19bec32.svg @@ -1,48 +1,48 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/getting_started/index.html b/dev/getting_started/index.html index f2495b1..bd26332 100644 --- a/dev/getting_started/index.html +++ b/dev/getting_started/index.html @@ -69,7 +69,7 @@ [-19.39944052157736, -2.411052627709995, 11.88863957641005, 19.334495160672596]

EasyModelAnalysis.jl then makes it easy to do complex queries about the model with simple one-line commands. For example, let's evaluate the values of x at times [0.0, 1.0, 2.0]:

get_timeseries(prob, x, [0.0, 1.0, 2.0])
3-element Vector{Float64}:
   1.0
  11.352378507900013
- 11.818298591583103

That's too simple, so now let's grab the time points where x achieves its maximum and minimum:

xmin, xminval = get_min_t(prob, x)
(5.367888368520106, -18.631902241638347)
xmax, xmaxval = get_max_t(prob, x)
(98.69718398680142, 19.64868962568079)

Was that simple? Let's see what x looks like:

phaseplot_extrema(prob, x, (x, y))
Example block output
plot_extrema(prob, x)
Example block output

and boom, it grabbed the correct value in something that's relatively difficult. That's the core of EasyModelAnalysis!

Lazily Defining Observables

One thing to note about EasyModelAnalysis is that algebraically defined observables can be used in place of any state definition. As an example, let's say we wished to get the time series points for abs(x+y)^2. To do that, we would just make that be our observable:

get_timeseries(prob, abs(x + y)^2, [0.0, 1.0, 2.0])
3-element Vector{Float64}:
+ 11.818298591583103

That's too simple, so now let's grab the time points where x achieves its maximum and minimum:

xmin, xminval = get_min_t(prob, x)
(5.367888368882691, -18.631902241638347)
xmax, xmaxval = get_max_t(prob, x)
(98.69718398641588, 19.64868962568079)

Was that simple? Let's see what x looks like:

phaseplot_extrema(prob, x, (x, y))
Example block output
plot_extrema(prob, x)
Example block output

and boom, it grabbed the correct value in something that's relatively difficult. That's the core of EasyModelAnalysis!

Lazily Defining Observables

One thing to note about EasyModelAnalysis is that algebraically defined observables can be used in place of any state definition. As an example, let's say we wished to get the time series points for abs(x+y)^2. To do that, we would just make that be our observable:

get_timeseries(prob, abs(x + y)^2, [0.0, 1.0, 2.0])
3-element Vector{Float64}:
    1.0
   80.34060709018749
- 249.44063778749026

and similarly for extrema:

xmin, xminval = get_min_t(prob, abs(x + y)^2)
(58.389900089071226, 8.975625826719526e-9)

This applies to data fitting, sensitivity analysis, thresholds, and everything!

+ 249.44063778749026

and similarly for extrema:

xmin, xminval = get_min_t(prob, abs(x + y)^2)
(90.14042317796658, 2.2918065904571357e-25)

This applies to data fitting, sensitivity analysis, thresholds, and everything!

diff --git a/dev/index.html b/dev/index.html index 9b20472..ca24d4a 100644 --- a/dev/index.html +++ b/dev/index.html @@ -232,7 +232,7 @@ ⌅ [8913a72c] NonlinearSolve v1.10.1 [6fe1bfb0] OffsetArrays v1.13.0 [4d8831e6] OpenSSL v1.4.1 - [429524aa] Optim v1.9.1 + [429524aa] Optim v1.9.2 ⌃ [3bd65402] Optimisers v0.2.15 ⌃ [7f7a1694] Optimization v3.19.3 ⌅ [3e6eede4] OptimizationBBO v0.1.5 @@ -279,7 +279,7 @@ [ae029012] Requires v1.3.0 [ae5879a3] ResettableStacks v1.1.1 [79098fc4] Rmath v0.7.1 - [f2b01f46] Roots v2.1.1 + [f2b01f46] Roots v2.1.2 [7e49a35a] RuntimeGeneratedFunctions v0.5.12 [e5567a89] SBML v1.5.1 [86080e66] SBMLToolkit v0.1.25 @@ -306,7 +306,7 @@ [d4ead438] SpatialIndexing v0.1.6 [276daf66] SpecialFunctions v2.3.1 [171d559e] SplittablesBase v0.1.15 - [aedffcd0] Static v0.8.8 + [aedffcd0] Static v0.8.9 [0d7ed370] StaticArrayInterface v1.5.0 [90137ffa] StaticArrays v1.9.2 [1e83bf80] StaticArraysCore v1.4.2 @@ -497,4 +497,4 @@ [8e850b90] libblastrampoline_jll v5.8.0+0 [8e850ede] nghttp2_jll v1.52.0+1 [3f19e933] p7zip_jll v17.4.0+0 -Info Packages marked with ⌃ and ⌅ have new versions available. Those with ⌃ may be upgradable, but those with ⌅ are restricted by compatibility constraints from upgrading. To see why use `status --outdated -m`

You can also download the manifest file and the project file.

+Info Packages marked with ⌃ and ⌅ have new versions available. Those with ⌃ may be upgradable, but those with ⌅ are restricted by compatibility constraints from upgrading. To see why use `status --outdated -m`

You can also download the manifest file and the project file.

diff --git a/dev/scenarios/scenario1/index.html b/dev/scenarios/scenario1/index.html index 8772421..eec4bee 100644 --- a/dev/scenarios/scenario1/index.html +++ b/dev/scenarios/scenario1/index.html @@ -82,4 +82,4 @@ sviivr_lbn_pth = joinpath(@__DIR__, "CHIME_SVIIvR_dynamics_BiLayer.json") sviivr_lbn = read_json_acset(LabelledBilayerNetwork, sviivr_lbn_pth) sviivr = LabelledPetriNet() -migrate!(sviivr, sviivr_lbn)

Model Analysis

Question 3 Numerical Comparison

Compare simulation outputs between the three models, for the following two scenarios. Assume initial values and parameter values are consistent (to the extent possible) with Table 1 in https://biomedres.us/pdfs/BJSTR.MS.ID.007413.pdf. For initial values that are not specified, choose reasonable values and ensure they are the same between the three models being compared.
i. Vaccine efficacy = 75%, population vaccinated = 10%
ii. Vaccine efficacy = 75%, population vaccinated = 80%

E(0) = 99500 # exposed
I(0) = 1 # infected
recovered, deceased = 0
N = 10000000
mu = 0.012048 # death rate
alpha = 0.00142 # fatality rate among unvaccinated
alphav = 0.00142 # fatality rate among vaccinated
betauu = 0.75 # probability of transmission per unvax contact * # of unvax contacts per time
gamma^-1 = 3.31 # reciprocal of recovery rate of unvax
gammav^-1 = 3.31 # " vax
eps^-1 = 5.7 # reciprocal of rate of exposed,unvax => infectious,unvax
epsv^-1 = 5.79 # " vax
xi = 0.5 # vaccine efficacy
kappa # fraction vaccinated

Run model 3ai

system = ODESystem(seirdnat_v)
prob = ODEProblem(system, [10000000-99500, 99500, 1, 0, 0.0], [0, 100], [0.75, 1/5.7, 1/3.31, 0.012048, 1e-3, 1e-3, 1e-3, 1e-3])

Question 4

Create an equally weighted ensemble model using the three models in 3b, and replicate the scenarios in 3.c.i and 3.c.ii. How does the ensemble model output compare to the output from the individual component models?

Question 5

For any of the models in question 3, conduct a sensitivity analysis to determine which intervention parameters should be prioritized in the model, for having the greatest impact on deaths – NPIs, or vaccine-related interventions?

Question 6

With the age-stratified model, simulate the following situations (you may choose initial values that seem reasonable given the location and time, and you can reuse values from any of the publications referenced):
i. High vaccination rate among older populations 65 years and older (e.g. 80%+), and low vaccination rate among all other age groups (e.g. below 15%)
ii. High vaccination rate among all age groups
iii. Repeat d.i and d.ii, but now add a social distancing policy at schools, that decreases contact rates by 20% for school-aged children only.
iv. Compare and summarize simulation outputs for d.i-d.iii

+migrate!(sviivr, sviivr_lbn)

Model Analysis

Question 3 Numerical Comparison

Compare simulation outputs between the three models, for the following two scenarios. Assume initial values and parameter values are consistent (to the extent possible) with Table 1 in https://biomedres.us/pdfs/BJSTR.MS.ID.007413.pdf. For initial values that are not specified, choose reasonable values and ensure they are the same between the three models being compared.
i. Vaccine efficacy = 75%, population vaccinated = 10%
ii. Vaccine efficacy = 75%, population vaccinated = 80%

E(0) = 99500 # exposed
I(0) = 1 # infected
recovered, deceased = 0
N = 10000000
mu = 0.012048 # death rate
alpha = 0.00142 # fatality rate among unvaccinated
alphav = 0.00142 # fatality rate among vaccinated
betauu = 0.75 # probability of transmission per unvax contact * # of unvax contacts per time
gamma^-1 = 3.31 # reciprocal of recovery rate of unvax
gammav^-1 = 3.31 # " vax
eps^-1 = 5.7 # reciprocal of rate of exposed,unvax => infectious,unvax
epsv^-1 = 5.79 # " vax
xi = 0.5 # vaccine efficacy
kappa # fraction vaccinated

Run model 3ai

system = ODESystem(seirdnat_v)
prob = ODEProblem(system, [10000000-99500, 99500, 1, 0, 0.0], [0, 100], [0.75, 1/5.7, 1/3.31, 0.012048, 1e-3, 1e-3, 1e-3, 1e-3])

Question 4

Create an equally weighted ensemble model using the three models in 3b, and replicate the scenarios in 3.c.i and 3.c.ii. How does the ensemble model output compare to the output from the individual component models?

Question 5

For any of the models in question 3, conduct a sensitivity analysis to determine which intervention parameters should be prioritized in the model, for having the greatest impact on deaths – NPIs, or vaccine-related interventions?

Question 6

With the age-stratified model, simulate the following situations (you may choose initial values that seem reasonable given the location and time, and you can reuse values from any of the publications referenced):
i. High vaccination rate among older populations 65 years and older (e.g. 80%+), and low vaccination rate among all other age groups (e.g. below 15%)
ii. High vaccination rate among all age groups
iii. Repeat d.i and d.ii, but now add a social distancing policy at schools, that decreases contact rates by 20% for school-aged children only.
iv. Compare and summarize simulation outputs for d.i-d.iii

diff --git a/dev/scenarios/scenario2/adcf8692.svg b/dev/scenarios/scenario2/9e24673b.svg similarity index 85% rename from dev/scenarios/scenario2/adcf8692.svg rename to dev/scenarios/scenario2/9e24673b.svg index 11a4ea8..cdb3562 100644 --- a/dev/scenarios/scenario2/adcf8692.svg +++ b/dev/scenarios/scenario2/9e24673b.svg @@ -1,62 +1,62 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario2/b17687b2.svg b/dev/scenarios/scenario2/da750f62.svg similarity index 86% rename from dev/scenarios/scenario2/b17687b2.svg rename to dev/scenarios/scenario2/da750f62.svg index 1b23cac..0662d3f 100644 --- a/dev/scenarios/scenario2/b17687b2.svg +++ b/dev/scenarios/scenario2/da750f62.svg @@ -1,56 +1,56 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario2/index.html b/dev/scenarios/scenario2/index.html index 9e68715..a868a58 100644 --- a/dev/scenarios/scenario2/index.html +++ b/dev/scenarios/scenario2/index.html @@ -21,7 +21,7 @@ prob = ODEProblem(seirhd, [], (0.0, 60.0), saveat = 1.0) sol = solve(prob) u60 = sol[:, end] -plot(sol)Example block output

Model Analysis

Parameterize model either using data from the previous two months (October 28th – December 28th, 2021), or with relevant parameter values from the literature.

data = [I => sol[I], R => sol[R], H => sol[H], D => sol[D]]
+plot(sol)
Example block output

Model Analysis

Parameterize model either using data from the previous two months (October 28th – December 28th, 2021), or with relevant parameter values from the literature.

data = [I => sol[I], R => sol[R], H => sol[H], D => sol[D]]
 prior_mean = [0.06, 0.015, 0.005, 0.003, 0.007, 0.001, 0.2, 0.04]
 prior_sd = [0.006, 0.0015, 0.0005, 0.0003, 0.0007, 0.0001, 0.02, 0.004]
 p = [β₁, β₂, β₃, α, γ₁, γ₂, δ, μ]
@@ -40,7 +40,7 @@
   μ => [0.039770153626576536, 0.04031129302443778, 0.03938864239950999, 0.040017502339673675, 0.03993530873618308, 0.04045195014214249, 0.039952437370086524, 0.039961958776015306, 0.040115106841463785, 0.03920586986880458  …  0.03990442350630812, 0.040548271048057374, 0.04055600487213483, 0.040107408388911095, 0.03969660396539464, 0.04022614865784575, 0.03992044617493397, 0.040170497563029484, 0.040294704470686504, 0.039247157481951817]

Question 1

Forecast Covid cases and hospitalizations over the next 3 months under no interventions.

prob = remake(prob; u0 = u60,
     p = Pair.(getfield.(fit, :first), mean.(getfield.(fit, :second))))
 forecast_threemonths = solve(prob, tspan = (0.0, 90.0), saveat = 1.0)
-plot(forecast_threemonths)
Example block output

Question 2

Based on the forecast, do we need interventions to keep total Covid hospitalizations under a threshold of 3000 on any given day? If there is uncertainty in the model parameters, express the answer probabilistically, i.e., what is the likelihood or probability that the number of Covid hospitalizations will stay under this threshold for the next 3 months without interventions?

need_intervention = maximum(forecast_threemonths[H]) > 0.05
true
post_mean = mean.(getfield.(fit, :second))
+plot(forecast_threemonths)
Example block output

Question 2

Based on the forecast, do we need interventions to keep total Covid hospitalizations under a threshold of 3000 on any given day? If there is uncertainty in the model parameters, express the answer probabilistically, i.e., what is the likelihood or probability that the number of Covid hospitalizations will stay under this threshold for the next 3 months without interventions?

need_intervention = maximum(forecast_threemonths[H]) > 0.05
true
post_mean = mean.(getfield.(fit, :second))
 post_sd = sqrt.(var.(getfield.(fit, :second)))
 trunc_min = post_mean .- 3 * post_sd
 trunc_max = post_mean .+ 3 * post_sd
@@ -120,4 +120,4 @@
 min_intervention_strength.u
1-element Vector{Float64}:
  0.9074074074074074
res = zeros(91)
 g(res, min_intervention_strength.u)
-maximum(res[1:91])
0.049689990075674755

Question 5

Now assume that instead of NPIs, the Board wants to focus all their resources on an aggressive vaccination campaign to increase the fraction of the total population that is vaccinated. What is the minimum intervention with vaccinations required in order for this intervention to have the same impact on cases and hospitalizations, as your optimal answer from question 3? Depending on the model you use, this may be represented as an increase in total vaccinated population, or increase in daily vaccination rate (% of eligible people vaccinated each day), or some other representation.

This requires an additional model structure, not very interesting to showcase the SciML stack.

+maximum(res[1:91])
0.049689990075674755

Question 5

Now assume that instead of NPIs, the Board wants to focus all their resources on an aggressive vaccination campaign to increase the fraction of the total population that is vaccinated. What is the minimum intervention with vaccinations required in order for this intervention to have the same impact on cases and hospitalizations, as your optimal answer from question 3? Depending on the model you use, this may be represented as an increase in total vaccinated population, or increase in daily vaccination rate (% of eligible people vaccinated each day), or some other representation.

This requires an additional model structure, not very interesting to showcase the SciML stack.

diff --git a/dev/scenarios/scenario3/f16a3c18.svg b/dev/scenarios/scenario3/28c84c7a.svg similarity index 84% rename from dev/scenarios/scenario3/f16a3c18.svg rename to dev/scenarios/scenario3/28c84c7a.svg index adecc10..b8095f5 100644 --- a/dev/scenarios/scenario3/f16a3c18.svg +++ b/dev/scenarios/scenario3/28c84c7a.svg @@ -1,58 +1,58 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario3/8eb321a5.svg b/dev/scenarios/scenario3/e945b859.svg similarity index 82% rename from dev/scenarios/scenario3/8eb321a5.svg rename to dev/scenarios/scenario3/e945b859.svg index 2118f0d..87d4956 100644 --- a/dev/scenarios/scenario3/8eb321a5.svg +++ b/dev/scenarios/scenario3/e945b859.svg @@ -1,46 +1,46 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario3/a7c6b236.svg b/dev/scenarios/scenario3/f6c354a8.svg similarity index 95% rename from dev/scenarios/scenario3/a7c6b236.svg rename to dev/scenarios/scenario3/f6c354a8.svg index 3296993..cc6a229 100644 --- a/dev/scenarios/scenario3/a7c6b236.svg +++ b/dev/scenarios/scenario3/f6c354a8.svg @@ -1,58 +1,58 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario3/index.html b/dev/scenarios/scenario3/index.html index aba5854..0fa0853 100644 --- a/dev/scenarios/scenario3/index.html +++ b/dev/scenarios/scenario3/index.html @@ -110,7 +110,7 @@ ts = 0:tend prob = ODEProblem(sys, u0init, (0.0, tend)) sol = solve(prob) -plot(sol)Example block output

Model Analysis

Question 1

Provide a forecast of cumulative Covid-19 cases and deaths over the 6-week period from May 1 – June 15, 2020 under no interventions, including 90% prediction intervals in your forecasts. Compare the accuracy of the forecasts with true data over the six-week timespan.

get_uncertainty_forecast(prob, [accumulation_I], ts, [u_conv => Uniform(0.0, 1.0)], 6 * 7)
42-element Vector{Matrix{Float64}}:
+plot(sol)
Example block output

Model Analysis

Question 1

Provide a forecast of cumulative Covid-19 cases and deaths over the 6-week period from May 1 – June 15, 2020 under no interventions, including 90% prediction intervals in your forecasts. Compare the accuracy of the forecasts with true data over the six-week timespan.

get_uncertainty_forecast(prob, [accumulation_I], ts, [u_conv => Uniform(0.0, 1.0)], 6 * 7)
42-element Vector{Matrix{Float64}}:
  [0.0 0.06859915653154794 … 1.0912045083115203 1.1242136179045317]
  [0.0 0.06560553807764849 … 0.471991956655876 0.48293750527584006]
  [0.0 0.07934106518148466 … 4.073541914499148 4.189837629456124]
@@ -136,7 +136,7 @@
  [0.0; 0.06444283952144138; … ; 0.2777714689307762; 0.2826581909782595;;]
  [0.0; 0.08095812471648772; … ; 4.4934615539329235; 4.612847746732835;;]
plot_uncertainty_forecast_quantiles(prob, [accumulation_I], ts,
     [u_conv => Uniform(0.0, 1.0)],
-    6 * 7)
Example block output

Question 2

Based on the forecasts, do we need additional interventions to keep cumulative Covid deaths under 6000 total? Provide a probability that the cumulative number of Covid deaths will stay under 6000 for the next 6 weeks without any additional interventions.

_prob = remake(prob, tspan = (0.0, 6 * 7.0))
+    6 * 7)
Example block output

Question 2

Based on the forecasts, do we need additional interventions to keep cumulative Covid deaths under 6000 total? Provide a probability that the cumulative number of Covid deaths will stay under 6000 for the next 6 weeks without any additional interventions.

_prob = remake(prob, tspan = (0.0, 6 * 7.0))
 prob_violating_threshold(_prob, [u_conv => Uniform(0.0, 1.0)], [accumulation_I > 0.4 * NN]) # TODO: explain 0.4*NN
0.22614782187965526

Question 3

We are interested in determining how effective it would be to institute a mandatory mask mandate for the duration of the next six weeks. What is the probability of staying below 6000 cumulative deaths if we institute an indefinite mask mandate starting May 1, 2020?

_prob = remake(_prob, p = [u_expo => 0.02])
 prob_violating_threshold(_prob, [u_conv => Uniform(0.0, 1.0)], [accumulation_I > 0.4 * NN])
0.0

Question 4

We are interested in determining how detection rate can affect the accuracy and uncertainty in our forecasts. In particular, suppose we can improve the baseline detection rate by 20%, and the detection rate stays constant throughout the duration of the forecast. Assuming no additional interventions (ignoring Question 3), does that increase the amount of cumulative forecasted cases and deaths after six weeks? How does an increase in the detection rate affect the uncertainty in our estimates? Can you characterize the relationship between detection rate and our forecasts and their uncertainties, and comment on whether improving detection rates would provide decision-makers with better information (i.e., more accurate forecasts and/or narrower prediction intervals)?

# these new equations add I->D and H->R  to the model.
 # this says now, that all I are undetected and u_hosp is the detection rate.
@@ -163,7 +163,7 @@
 
 probd = ODEProblem(sys2_, u0init2, (0.0, tend))
 sold = solve(probd; saveat = ts)
-plot(sold)
Example block output
sols = []
+plot(sold)
Example block output
sols = []
 u_detects = 0:0.1:1
 for x in u_detects
     probd = remake(probd, p = [u_hosp => x])
@@ -202,4 +202,4 @@
     [u_conv => Uniform(0.0, 1.0)],
     6 * 7)
plot_uncertainty_forecast_quantiles(prob2, [accumulation_I], 0:100,
     [u_conv => Uniform(0.0, 1.0)],
-    6 * 7)

Do a 3-way structural model comparison between the SEIRD, SEIRHD, and SIRHD models.

#

https://github.com/SciML/EasyModelAnalysis.jl/issues/22

Question 7

What is the latest date we can impose a mandatory mask mandate over the next six weeks to ensure, with 90% probability, that cumulative deaths do not exceed 6000? Can you characterize the following relationship: for every day that we delay implementing a mask mandate, we expect cumulative deaths (over the six-week timeframe) to go up by X?

+ 6 * 7)

Do a 3-way structural model comparison between the SEIRD, SEIRHD, and SIRHD models.

#

https://github.com/SciML/EasyModelAnalysis.jl/issues/22

Question 7

What is the latest date we can impose a mandatory mask mandate over the next six weeks to ensure, with 90% probability, that cumulative deaths do not exceed 6000? Can you characterize the following relationship: for every day that we delay implementing a mask mandate, we expect cumulative deaths (over the six-week timeframe) to go up by X?

diff --git a/dev/scenarios/scenario4/0779418b.svg b/dev/scenarios/scenario4/0779418b.svg new file mode 100644 index 0000000..84894d1 --- /dev/null +++ b/dev/scenarios/scenario4/0779418b.svg @@ -0,0 +1,50 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario4/3112d506.svg b/dev/scenarios/scenario4/3112d506.svg new file mode 100644 index 0000000..f4d1efd --- /dev/null +++ b/dev/scenarios/scenario4/3112d506.svg @@ -0,0 +1,46 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario4/3343ea22.svg b/dev/scenarios/scenario4/3343ea22.svg deleted file mode 100644 index 1bb1549..0000000 --- a/dev/scenarios/scenario4/3343ea22.svg +++ /dev/null @@ -1,46 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/dev/scenarios/scenario4/878de623.svg b/dev/scenarios/scenario4/878de623.svg new file mode 100644 index 0000000..d71db98 --- /dev/null +++ b/dev/scenarios/scenario4/878de623.svg @@ -0,0 +1,48 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario4/521ace99.svg b/dev/scenarios/scenario4/c37e1e62.svg similarity index 99% rename from dev/scenarios/scenario4/521ace99.svg rename to dev/scenarios/scenario4/c37e1e62.svg index a33519b..dc3c62b 100644 --- a/dev/scenarios/scenario4/521ace99.svg +++ b/dev/scenarios/scenario4/c37e1e62.svg @@ -1,60 +1,60 @@ - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/dev/scenarios/scenario4/cfd3c45f.svg b/dev/scenarios/scenario4/cfd3c45f.svg deleted file mode 100644 index 58d036f..0000000 --- a/dev/scenarios/scenario4/cfd3c45f.svg +++ /dev/null @@ -1,50 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/dev/scenarios/scenario4/d480b6aa.svg b/dev/scenarios/scenario4/d480b6aa.svg deleted file mode 100644 index d2d83b7..0000000 --- a/dev/scenarios/scenario4/d480b6aa.svg +++ /dev/null @@ -1,48 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/dev/scenarios/scenario4/index.html b/dev/scenarios/scenario4/index.html index d71bf3e..69b0173 100644 --- a/dev/scenarios/scenario4/index.html +++ b/dev/scenarios/scenario4/index.html @@ -71,12 +71,12 @@ prob = ODEProblem(sys, u0init, (0.0, tdays)) sol = solve(prob) -plot(sol)Example block output

Model Analysis

Question 1

Define a return-to-campus strategy that minimizes total testing while maintaining infections below the initial isolation bed capacity of 430. The testing scheme can include an arrival testing strategy in addition to unique testing approaches within time periods of the simulation. Cohorts can have unique testing strategies defined by test type and number per week.

# Minimize u_test subject to IS <= 430
+plot(sol)
Example block output

Model Analysis

Question 1

Define a return-to-campus strategy that minimizes total testing while maintaining infections below the initial isolation bed capacity of 430. The testing scheme can include an arrival testing strategy in addition to unique testing approaches within time periods of the simulation. Cohorts can have unique testing strategies defined by test type and number per week.

# Minimize u_test subject to IS <= 430
 p_opt, s2, ret = optimal_parameter_threshold(prob, IS, 430, u_test, [u_test], [0.0], [NN],
     maxtime = 10);
-plot(s2, idxs = [IS])
Example block output
p_opt, s2, ret = optimal_parameter_threshold(prob, D, 430, u_test, [u_test], [0.0], [NN],
+plot(s2, idxs = [IS])
Example block output
p_opt, s2, ret = optimal_parameter_threshold(prob, D, 430, u_test, [u_test], [0.0], [NN],
     maxtime = 10);
-plot(s2, idxs = [I, IS, D])
Example block output

Question 4

Challenge question: assume that antigen tests are one fifth the cost of PCR tests but also much less (~half) as sensitive. Incorporate the cost of the testing program into your recommendations.

# Minimize u_test subject to IS <= 430
+plot(s2, idxs = [I, IS, D])
Example block output

Question 4

Challenge question: assume that antigen tests are one fifth the cost of PCR tests but also much less (~half) as sensitive. Incorporate the cost of the testing program into your recommendations.

# Minimize u_test subject to IS <= 430
 p_opt, s2, ret = optimal_parameter_threshold(prob, IS, 430, 5 * u_test, [u_test], [0.0],
     [NN], maxtime = 10);
-plot(s2, idxs = [IS])
Example block output +plot(s2, idxs = [IS])Example block output diff --git a/dev/scenarios/scenario5/index.html b/dev/scenarios/scenario5/index.html index 4f3f441..72f5a71 100644 --- a/dev/scenarios/scenario5/index.html +++ b/dev/scenarios/scenario5/index.html @@ -1,2 +1,2 @@ -Scenario 5: Bucky model · EasyModelAnalysis.jl

Scenario 5: Bucky model

The original Bucky model is structured to handle population data stratified in 16 5-year bins, as described in the documentation. You’ve recently found a publication about an age-structured SIR model describing the spread of Covid in the state of Washington, USA, which was ‘ground zero’ of the Covid-19 pandemic in the United States, with the first confirmed case and first confirmed death in the country. (https://doi.org/10.1038/s41598-021-94609-3). Modify the Bucky model to use data stratified in 9 10-year bins, as shown in the age-contact matrix in Figure 1. Simulate the first 3 months of the Covid-19 pandemic in Washington, using the modified Bucky model.


Now let's do a datafit. We need to choose initial parameters for the fitting process and call datafit:

psub_ini = [σ => 27.0, β => 3.0]
fit = datafit(prob, psub_ini, tsave, data)
2-element Vector{Pair{Num, Float64}}:
 σ => 28.000000012195933
 β => 2.66666667211938

Recall that our starting parameters, the parameters the dataset was generated from, were [σ => 28.0, ρ => 10.0, β => 8 / 3]. Looks like this did a good job of recovering them!

ensem_prediction = sum(stack([ensem_weights[i] * sol[i][R] for i in 1:length(forecast_probs)]), dims = 2)
plot(sol; idxs = R, color = :blue)
plot!(t_forecast, ensem_prediction, lw = 3, color = :red)
scatter!(t_forecast, data_forecast[3][2][2])

 :β_first_order => 0.0326214
 :ρ_total_order => 0.988595
 :β_total_order => 0.879228
 :ρ_β_second_order => 0.825519

The output shows values of first_order, second_order and total_order sensitivities. These are quantities that describe the (possibly nonlinear) effect of the variables on the output. The first order values can be thought of as the independent effects of each variable, similar to the $R^2$ values of a linear regression, i.e. the variance explained by the independent effect of a given parameter. The first indices are the scaled variance terms, for example:

ρ_first_order = V[y(100) changing only ρ] / V[y(100)]

where V denotes the variance. I.e., ρ_first_order is the fraction of variance explained by changing only ρ. Because these indices are normalized, if the model were linear then ρ_first_order + β_first_order == 1, and thus the sum of the first-order indices tells us the degree of nonlinearity. In our simulation the sum of the first-order indices is < 0.2, an indication of a high degree of nonlinear interaction between the measured output and the parameters.
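
Stated in the usual Sobol notation (a sketch, with $Y = y(100)$ and the other parameters averaged out), "changing only ρ" means taking the variance of the conditional expectation:

$S_{\rho} = \frac{V[\,E[Y \mid \rho]\,]}{V[Y]}$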

The second order indices then say how much can be attributed to changes of combinations of variables. I.e.:

ρ_β_second_order = V[y(100) changing only ρ and β] / V[y(100)]

which thus gives the percentage of variance explained by the nonlinearities of ρ and β combined. These sensitivity functions only output up to the second indices, since there is a combinatorial explosion in the number of terms that need to be computed for models with more parameters. However, in this tutorial there are only two variables, and thus all variance should be explained by just these two parameters. This means that ρ_first_order + β_first_order + ρ_β_second_order should be approximately equal to 1, as all variance should be explained by the linearity or second order interactions between the two variables. Let's check this in action:

sensres[:ρ_first_order] + sensres[:β_first_order] + sensres[:ρ_β_second_order]
0.9656149538655133

This is not exactly equal to 1 due to the numerical error in the integral approximations, but you can see theory in action! (Also, this is a good test for correctness of the implementation).

Now, if you had more than two variables, is there a good way to get a sense of the “sensitivity due to ρ”? The first-order index is a bad approximation to this, since it only measures the independent or linear sensitivity of the output due to ρ. What one wants instead is the sum of the first-order index ρ_first_order, plus all second-order effects which include ρ, plus all third-order effects that include ρ, and so on all the way up to the Nth order for N variables. Surprisingly, this summation can be computed without computing all of the higher-order indices; it is known as the “total order index” of a variable. In a two-parameter model, we can see this in action:

sensres[:ρ_first_order] + sensres[:ρ_β_second_order], sensres[:ρ_total_order]
(0.932993563491674, 0.9885950888540156)
sensres[:β_first_order] + sensres[:ρ_β_second_order], sensres[:β_total_order]
(0.8581400072304508, 0.879227712171329)

Thus, the total indices are a good measurement of the relative size of the total effect of each parameter on the solution of the model.

In summary:

  • First order indices showcase the amount of linearity and the direct linear attributions to each variable
  • The second order indices show how much of the output variance is explained by pairwise interactions between variables
  • The total indices measure the total effect a given variable has on the variance of the output

and notably, all values are normalized relative quantities.
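
For reference, the corresponding Sobol definitions of the second-order and total-order indices are, roughly, in the same notation as above:

$S_{\rho\beta} = \frac{V[\,E[Y \mid \rho, \beta]\,]}{V[Y]} - S_{\rho} - S_{\beta}, \qquad S_{T_{\rho}} = 1 - \frac{V[\,E[Y \mid \beta]\,]}{V[Y]}$

so the total-order index of ρ is everything not explained by β alone, which is why it can be computed without enumerating all the higher-order terms.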

Thus we can finally use the create_sensitivity_plot function to visualize the full set of sensitivity results:

create_sensitivity_plot(prob, 100.0, y, pbounds; samples = 2000)
Example block output

which shows the relative sizes of the values in plots for the first, second, and total index values.

prob = ODEProblem(sys, [🐰 => 0.01], (0.0, 10.0), [p => 1.0])
ODEProblem with uType Vector{Float64} and tType Float64. In-place: true
timespan: (0.0, 10.0)
u0: 1-element Vector{Float64}:
 0.01

In this model x(t) is the number of billions of bunnies (commonly referred to in the units bB) and t is given in units of months. Solving this ODE shows the number of bunnies grows exponentially over time as they are all fat and happy with plenty of grass to feed on:

plot(solve(prob))
Example block output

We can ask, how long does it take for this exponential growth to cause the continuous number of bunnies x to cross the value of 50bB? Let's ask:

get_threshold(prob, 🐰, 50)
8.517445010816738

Which, eyeballing the plot, looks correct: in about 8 and a half months the population will reach 50 billion bunnies!
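
We can also sanity check this analytically. Assuming the model is plain exponential growth, $x'(t) = p\,x(t)$ with $p = 1.0$ and $x(0) = 0.01$, the threshold time solves $0.01\,e^{t} = 50$, giving $t = \ln(5000) \approx 8.517$, which matches the numerical result above.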

But now let's create an intervention. Let's say a corporation has created an “Earth viewing tourism” business where humans are watching the bunny population through a telescope. Market research has found that when there are more than 3bB in the population, the reaction goes from “that's cute” to “eww too many rodents”, and thus to maximize their profits the corporation wants to fire a laser that indiscriminately kills bunnies at an exponential rate (which happens to be twice the growth rate of the bunny population). But as you probably know from experience, death rays that cover the distance from Mars to Earth cost a lot to operate, so the corporation wants to run this for as short a time as possible. The company knows they will get shut down after 50 months anyway once regulators catch up, so they only need to keep the planet looking cute for that short amount of time.

How should the bunny murder operation commence in order to maximize corporate profits? To get the desired result, we use the optimal_threshold_intervention function. Our intervention will be to decrease the growth rate from $p = 1.0$ to $p = -1.0$ (i.e. decrease it by twice the birth rate). What we want to know is the minimal-duration intervention that keeps the population under 3bB until time 50. Thus, the call looks as follows:

opt_tspan, (s1, s2, s3), ret = optimal_threshold_intervention(prob, [p => -1.0], 🐰, 3, 50);
 
opt_tspan
2-element Vector{Float64}:
  5.152145712215946
 27.818516294222256

The opt_tspan gives us the optimal timespan of the intervention: we should begin the decimation of bunny civilization when it reaches 5.15 months, and then we can turn the destruction device off at 27.8 months into our tourism operation. To see the effect of this, we can plot the results of the three sub-intervals:

plot(s1, lab = "pre-intervention")
plot!(s2, lab = "intervention")
plot!(s3, xlims = (0, s3.t[end]), ylims = (0, 5), lab = "post-intervention", dpi = 300)
Example block output

Let's understand this result. Because of the exponential growth, we want to start the laser as late as possible, as that will give us the most “bang for the buck”, or rather, “burn for the buck”, killing the most bunnies in the least amount of time. Thus, we want to wait for the population to grow a bit and be easier for the laser to hit. Then we leave the laser on for as short a time as possible. By choosing 27.8 months as the off point, the rest of the time stays tourism-friendly. When regulators finally get in touch on the 50th month, the population will have exactly reached the 3bB tipping point, allowing us to fully use that entire time for unburned bunny tourism while also making regulators more sympathetic to our burning cause by the time they reach out to us.

Of course, this could also be used to uncover optimal policies for handling pandemics, but whatever your desired use case is, churn away.
