Things to consider
- classifying something as a model carries with it important consequences related to cost and productivity.
- Validation procedures can be both expensive for the institution and onerous for the model owner.
- Model validations pull model owners away from their everyday work, adversely affecting productivity and, sometimes, quality of work.
- An excess of reports on comparatively unimportant findings can bury enterprise risk managers and distract them from the most urgent issues.
Definition of a model
A quantitative method, system, or approach that applies statistical, economic, financial, or mathematical theories, techniques, and assumptions to process input data into quantitative estimates
- Models have three components:
 - An information input component
 - A processing component
 - A reporting component
- Quantitative estimate implies a level of uncertainty about the outputs.
On Spreadsheet models
- Spreadsheets generally fall into one of two types:
- a model that transforms inputs into quantitative estimates or
- a non-model spreadsheet that generates defined arithmetic results.
- Ask whether back-testing is required to gauge the accuracy of the spreadsheet’s outputs
- Ask whether the spreadsheet simply applies a defined set of business rules (if so, it is not a model)
- Spreadsheets should be classified as models (and validated as such) when they produce forward-looking estimates that can be back-tested.
On vendor models
- Vendor documentation is not a substitute for model documentation.
- Documentation should cover:
 - Discussion of the model’s purpose and specific application
 - Discussion of model theory and approach
 - Description of the model’s structure
 - Identification of model limitations and weaknesses
 - Comprehensive list of inputs and assumptions, including their sources
 - Comprehensive list of outputs and reports
 - List of model settings
 - Description of testing
- Testing results, including performance monitoring and outcomes analysis, should be requested of the vendor (per OCC 2011-12)
- Model validators should attempt to replicate the results of these studies, where feasible
- Developmental evidence should be requested of the vendor: (per OCC 2011-12)
- Banks should have contingency plans for the vendor model
- Implementation of a reliable challenger model (as a contingency)
On general model documentation
- Model validations frequently seem to occur at the most inopportune moments for model owners.
- Documentation would ideally be prepared during periods of lower operational stress.
- Documentation follows these basic criteria:
 - Identifies the model’s purpose
 - Comprehensively lists and justifies the model’s inputs and assumptions
 - Describes the model’s overall theory and approach
 - Lays out the developmental evidence supporting modeling choices
 - Identifies the limitations of the model
 - Explains how the model is controlled and by whom
 - Comprehensively identifies and describes the model’s outputs, how they are used, and how they are tested
On Model validation
- Comprehensive model validations consist of three main components:
 - Conceptual soundness
 - Monitoring and benchmarking
 - Outcomes analysis and back-testing
- Performance testing is the core of any model validation:
- Sensitivity analysis
- Stress testing
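The sensitivity analysis named above can be sketched in code. This is a minimal one-at-a-time sketch, assuming a hypothetical toy `model` function and invented baseline inputs (none of these names or numbers come from the notes); it shocks each input by 10% and reports the proportional change in the output, which also addresses the "relative importance of the input" criterion below.

```python
# One-at-a-time sensitivity analysis for a hypothetical model.
# The model function, input names, and values are illustrative
# assumptions, not taken from any institution's model inventory.

def model(rate, balance, loss_factor):
    """Toy expected-loss model: a quantitative estimate from three inputs."""
    return balance * loss_factor * (1 + rate)

baseline = {"rate": 0.05, "balance": 1_000_000, "loss_factor": 0.02}

def sensitivity(model_fn, inputs, shock=0.10):
    """Shock each input up by `shock` (default 10%) and return the
    proportional change in the model's output, input by input."""
    base_out = model_fn(**inputs)
    results = {}
    for name in inputs:
        shocked = dict(inputs)
        shocked[name] = inputs[name] * (1 + shock)
        results[name] = (model_fn(**shocked) - base_out) / base_out
    return results

print(sensitivity(model, baseline))
```

In this toy case the output is linear in `balance` and `loss_factor`, so a 10% shock to either moves the output by exactly 10%, while the same shock to `rate` moves it far less; that ranking is exactly the "relative importance" a validator would want documented.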
- Validate model inputs based on:
- Complexity of inputs
 - Degree of manual manipulation of inputs between the source system and the model
- Reliability of source system
- Relative importance of the input to the model’s outputs (i.e., sensitivity)
On Benchmarking and Back-testing
- As a best practice, model developers should list any alternative methodologies, theories, or data that were omitted from the model’s final version.
- Back-testing measures a model’s outcomes and accuracy against real-world observations.
- Benchmarking measures those outcomes against those of other models or metrics.
- Back-testing and benchmarking should ideally be performed together.
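The distinction between the two can be illustrated with a minimal sketch. All of the series and numbers below are invented for illustration: the same forecasts are scored once against realized outcomes (back-testing) and once against a hypothetical challenger model's outputs (benchmarking), using mean absolute error as the comparison metric.

```python
# Back-testing: compare model forecasts against realized outcomes.
# Benchmarking: compare the same forecasts against a challenger model.
# All series below are invented for illustration only.

forecasts  = [100, 110, 120, 130]   # model's forward-looking estimates
actuals    = [ 98, 112, 119, 135]   # realized real-world observations
challenger = [101, 109, 122, 128]   # challenger / benchmark model outputs

def mean_abs_error(a, b):
    """Average absolute gap between two equal-length series."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

backtest_mae  = mean_abs_error(forecasts, actuals)     # accuracy vs. reality
benchmark_gap = mean_abs_error(forecasts, challenger)  # divergence from challenger

print(f"back-test MAE: {backtest_mae:.2f}")   # → 2.50
print(f"benchmark gap: {benchmark_gap:.2f}")  # → 1.50
```

Running both together, as the notes recommend, is informative: a model can track a challenger closely (small benchmark gap) while still missing reality (large back-test error), which would flag a shared weakness in both models rather than validate either one.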