Section 1 Asking the right question
Agreeing what the question is
1.1 Understanding the problem
Clarify what the real question is
1.1.1 It is important to ensure that the question is correctly framed to address the problem. For example, are we really interested in ‘how much money will my policy save?’, or should we be asking ‘what is the likelihood that this policy would save more than £x?’.
1.1.2 As well as clearly identifying the overarching question to answer, we should also ensure that any sub-questions supporting the analysis are appropriately thought through. For example, in an education context we may be interested in uncertainty on both an academic-year and a financial-year basis.
Appropriate use of outputs
1.2 How will output ranges be used?
Discuss how the outputs will be used
1.2.1 Will the ultimate decision be made solely on the basis of the uncertainty analysis, or is it one part of a bigger picture informing the decision? The more influential the uncertainty analysis is, and the more that hinges on the decision, the more important it is that the uncertainty analysis is robust.
1.2.2 If the output is to be fed into ‘downstream’ models, then it is important to understand the requirements of the downstream model. For example, if scenarios are used to illustrate uncertainty, then these may not be suitable for a ‘downstream’ Monte Carlo simulation.
Are there dependent models drawing on the analysis?
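As an illustration of the point in 1.2.2, scenario outputs alone may not give a downstream Monte Carlo model enough to sample from. The Python sketch below uses purely illustrative figures and assumes, for the sake of example, that the scenarios can be treated as the parameters of a triangular distribution; that assumption would need to be agreed with the upstream analyst.

```python
import numpy as np

rng = np.random.default_rng(42)

# Upstream output expressed as three scenarios (illustrative £m savings).
scenarios = {"low": 0.5, "central": 3.0, "high": 5.5}

# One possible bridge: treat the scenarios as the parameters of a triangular
# distribution so the downstream Monte Carlo model has something to sample from.
samples = rng.triangular(scenarios["low"], scenarios["central"], scenarios["high"], size=10_000)

print(f"Mean saving: £{samples.mean():.1f}m")
print(f"Central 90% interval: £{np.percentile(samples, 5):.1f}m to £{np.percentile(samples, 95):.1f}m")
```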
Incorporating uncertainty into analysis
1.3 What information do we need from the decision maker to appropriately incorporate uncertainty analysis?
Explain how the uncertainty can be used to better inform decisions
1.3.1 Help the decision maker to understand how the information supplied about uncertainty in the analysis can be used to support a better-informed decision. Avoid giving misleading assurances about what the uncertainty analysis can achieve: it is unlikely that all sources of uncertainty will be quantifiable or supported by robust underpinning evidence.
1.3.2 There are many techniques that can be used to model uncertainty, not all of which will be appropriate for a given piece of analysis. A poor choice of technique may give misleading results. If there is a high degree of uncertainty, more detailed techniques may be misleading or imply spurious accuracy.
Avoid misleading results or spurious accuracy by choosing the appropriate technique
1.3.3 Discuss with the decision maker what level of uncertainty is acceptable. Do they want to know how wrong the forecast would need to be in order to change or rethink the policy? Or are they simply interested in an output “range”? If so, what does that “range” actually mean?
Check the policy maker’s risk appetite and how uncertainty will inform their decision
1.3.4 Consider the purpose of the analysis, the decision being made and how the analysis is intended to be used. Your analysis may vary depending on whether this is a policy, operational or financial decision.
The uncertainty analysis may differ depending on the type of decision that is being made
1.3.5 It may help to work through an example with the decision maker: the answer to the question of how much a policy would save may be £3m, with uncertainty analysis giving a broad range of £0.5-5.5m. The analyst should discuss with the decision maker how they want the analysis to be framed, for example (a short sketch of how such statements can be derived follows this list):
“A range of £0.5-5.5m”, or
“The estimated savings are £3m, with analysis showing a 90% likelihood that savings will be between £1-5m”, or
“Analysis shows that there is an 80% likelihood that the savings will be greater than £2m”, or
“The policy needs to have X amount of takeup in order to break even”
Discuss examples of how the decision maker may want to think about uncertainty
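As an illustration only, the framings above could all be read off a single modelled distribution of savings. The Python sketch below uses a hypothetical lognormal distribution and made-up break-even figures, so the resulting numbers will not exactly match the example figures quoted above.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical distribution of savings in £m, centred near £3m (illustrative only).
savings = rng.lognormal(mean=np.log(3.0), sigma=0.5, size=100_000)

low, high = np.percentile(savings, [5, 95])   # central 90% interval
p_above_2m = (savings > 2.0).mean()           # likelihood of saving more than £2m
print(f"90% likelihood that savings lie between £{low:.1f}m and £{high:.1f}m")
print(f"{p_above_2m:.0%} likelihood that savings exceed £2m")

# Break-even framing, with made-up costs: how much take-up is needed?
fixed_cost = 2.0            # £m implementation cost (hypothetical)
saving_per_person = 0.0002  # £m saved per person taking up the policy (hypothetical)
print(f"Break-even take-up: {fixed_cost / saving_per_person:,.0f} people")
```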
1.3.6 For an operational decision, is it important to quantify the impact of the uncertainty (i.e. if we’re out by 1,000, this could cost us an additional…)? While an in-depth assessment of uncertainty may be useful when informing policy decisions, it may be unnecessary when the analysis is being used for high level monitoring.
1.3.7 It may be that operational decision makers do not want to see a range of results, but instead want to plan to a certain level of confidence, such as 65% or 95% rather than 50%. For example, when planning the number of schools, prison places or GPs we’ll need over the next 5 years, it may be more appropriate to plan to a higher level of confidence than 50%.
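A minimal sketch of the difference between planning to the central estimate and planning to a higher level of confidence, using a purely hypothetical demand forecast:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical forecast of school places needed in five years' time.
forecast_demand = rng.normal(loc=10_000, scale=800, size=50_000)

plan_50 = np.percentile(forecast_demand, 50)  # plan to the central estimate
plan_95 = np.percentile(forecast_demand, 95)  # only a 5% chance of a shortfall
print(f"Plan at 50% confidence: {plan_50:,.0f} places")
print(f"Plan at 95% confidence: {plan_95:,.0f} places")
```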
1.3.8 For financial decisions, decision makers may be interested in understanding the likelihood of receiving a certain level of income, or the likelihood that particular risks and opportunities will materialise. However, the analysis would need to go hand in hand with financial risk management, to mitigate the risks and to crystallise the opportunities.
1.3.9 Your analysis could be used to advise customers on how a system is driven, and how the parameters interact to affect the final output. Through scenario and sensitivity testing, you can identify the dependencies between model parameters and which of them the output is most sensitive to, and interpret what this means for the final output.
Your analysis could be used to provide advice to customers on how a system is driven or how to manage/plan resources
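One simple way to explore which parameters drive the output is one-at-a-time sensitivity testing. The sketch below uses a made-up cost model and a +/-10% swing on each input; both are assumptions for illustration, not a recommended model.

```python
def policy_cost(takeup, unit_cost, admin_cost):
    """Hypothetical model: total policy cost in £m."""
    return takeup * unit_cost + admin_cost

baseline = {"takeup": 50_000, "unit_cost": 0.0001, "admin_cost": 2.0}
base_output = policy_cost(**baseline)

# Vary each input by +/-10% in turn and record how far the output moves.
for name, value in baseline.items():
    swings = []
    for factor in (0.9, 1.1):
        inputs = dict(baseline, **{name: value * factor})
        swings.append(policy_cost(**inputs) - base_output)
    print(f"{name}: output moves by £{min(swings):+.2f}m to £{max(swings):+.2f}m")
```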