
Model Management and Monitoring

Executive Summary

A model may be defined as a mathematical process, whether deterministic or stochastic in nature, that transforms raw data into a synthetic score on which decisions can be made in the acquisition of new business, the mitigation of risk, and the planning of future scenarios.

 

A model should not be seen as a separate or distinct application within the organisation, but rather as a piece of intellectual property originating from specific underlying factors unique to that organisation. These factors include input and output data, statistical regressions and estimations (performance of the model), information for decision making (model output), personnel training and support (administrative information), and reporting (validation). Without proper management and monitoring of the model and all its underlying factors, the model may produce inaccurate results, or become redundant in the organisational structure, resulting in monetary loss.

 

This paper focuses on the threats posed to an organisation by inadequate model management, and on the management of such threats across the model life-cycle.

 

Models are complex applications which affect the manner in which a corporation makes its day-to-day decisions, as well as the way in which it reports to its regulator. Without proper model management and monitoring, a corporation may suffer monetary losses, either by basing decisions on incorrect information or through the impact on its regulatory capital.

Model Related Threats to an Organisation

Various types of models are utilised within the day-to-day business of an organisation. Depending on the type of model and the business motivation behind it, the output calculated or derived from the model will have an impact on management decisions. These decisions could involve introducing a new product, entering a new market, providing credit to a risky client, assessing the premium or interest charge on a product to a specific customer, or financial planning, among many others. Crucial business decisions based on inaccurate information may jeopardise the future of the organisation. Business models should be seen by the organisation as an investment of similar consequence to purchasing a new office building or even the acquisition of a new business. Investments such as these are under constant scrutiny to ensure that the required return is realised.

 

The initial drive to implement a business model in an organisation originates from underlying factors which require a fast reaction to achieve full benefit. Some examples include the need to comply with regulatory requirements, financial planning for assets or a business venture, monitoring of risky customers, and striving to outperform competitors in a particular business line. In the initial momentum behind such drives, best practice procedures with respect to the development, implementation, and maintenance of models are often an afterthought.

 

A prominent example is the immense range of challenges the banking industry has confronted in complying with regulatory standards. Three approaches exist within the Basel II accord for the calculation of a bank’s regulatory capital. The Standardised Approach uses simple risk weights and hence requires no statistical models. The Foundation Internal Ratings Based Approach (FIRB) requires probability of default models, while the Advanced Internal Ratings Based Approach (AIRB) requires probability of default (PD), exposure at default (EAD), loss given default (LGD) and effective maturity (M) models across the relevant asset classes.
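The capital sensitivity at stake can be made concrete. As an illustrative sketch only, the code below implements the Basel II AIRB risk-weight function for corporate exposures (supervisory asset correlation, conditional loss at the 99.9th percentile, and maturity adjustment) using the Python standard library; a production implementation would follow the supervisor's exact specification, including the firm-size adjustment omitted here.

```python
from math import exp, log, sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal CDF and inverse CDF


def airb_capital_requirement(pd_, lgd, m=2.5):
    """Basel II AIRB capital requirement K per unit of EAD for a corporate
    exposure (firm-size adjustment omitted). pd_ and lgd are decimals."""
    # Supervisory asset correlation: falls from 0.24 towards 0.12 as PD rises
    w = (1 - exp(-50 * pd_)) / (1 - exp(-50))
    r = 0.12 * w + 0.24 * (1 - w)
    # Conditional expected loss at the 99.9th percentile, less expected loss
    k = lgd * N.cdf((N.inv_cdf(pd_) + sqrt(r) * N.inv_cdf(0.999)) / sqrt(1 - r)) - pd_ * lgd
    # Maturity adjustment around the 2.5-year benchmark
    b = (0.11852 - 0.05478 * log(pd_)) ** 2
    return k * (1 + (m - 2.5) * b) / (1 - 1.5 * b)


def risk_weighted_assets(pd_, lgd, ead, m=2.5):
    """RWA = K x 12.5 x EAD under the AIRB approach."""
    return airb_capital_requirement(pd_, lgd, m) * 12.5 * ead
```

A small shift in an inaccurate PD or LGD estimate flows directly into the capital requirement, which is precisely why the models producing these inputs attract supervisory scrutiny.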

 

In order to derive the potential benefit of a reduced regulatory capital requirement using the AIRB calculation, numerous banks developed and implemented their models in an extremely short space of time. At that stage, credit modelling was a fairly new concept to banks, with the result that the necessary processes and structures were not put in place. Inevitably, this lack of processes and structures allowed inaccuracies, errors, and omissions to creep into the model environment over time, giving rise to potential capital penalties rather than the envisaged capital reduction. For an organisation to fully comprehend a model and its output, a few factors should be taken into account:

 

  • What was the initial purpose of the model?
  • At what stage is the model in its life cycle?
  • How was the model developed?
    • Who developed the model?
    • What methodologies / parameters does the model utilise to estimate its final output?
    • Is there a clear understanding of the data requirement of the model?
    • What data was utilised in the development and initial calibration of the model?
    • How was the model initially validated to the regulator? Could this validation be replicated for the regulator if required?
  • When last was the model calibrated? Could this calibration be replicated to determine its current parameters?
  • Who is responsible for the day-to-day operation of the model? Does this person have the necessary understanding / support to operate the model?
  • Is the data utilised by the model of the required quality? Is this data accessible in a controlled environment?
  • What is the change control procedure which a model should adhere to? Are these changes properly documented?
  • Is there any documentation available for the model?

 

These are merely some of the questions which an organisation needs to answer to fully grasp the output and workings of a single model. If this information cannot be obtained, an extensive model analysis may be required. Poor model governance may thus result in the necessity to conduct a full-scale analysis, whether by costly internal resources or by external consultancy.

 

More alarming yet is the fact that many organisations operate numerous models across various divisions, each serving a different purpose, and with disparate mechanisms and integration lines into the organisation. If there are no structures and procedures according to which all these models are governed on a group consolidated level, an organisation will suffer the consequences of having implemented models which are or may become inaccurate and / or redundant.

 

With proper model management from the initial development of a model, all necessary information and data is immediately available to management to rectify any underlying issues, as well as to comprehend the originating source of these issues. Model management should be performed in a structured manner through all phases of the model’s lifecycle (Analysis, Development, Implementation, and Business As Usual (BAU)).

 

Management and Monitoring

In order to understand how a model is monitored and managed in a structured environment, an understanding of the factors that affect it is required. Four areas can be identified which need to be taken into consideration in the management of a model, namely:

 

  • Data Management
  • Model Performance Management
  • Model Monitoring and Reporting
  • Administrative Information Management

 

These areas should not be assessed and analysed separately, but rather as intertwined pieces which contribute to the final output of the model. If just one of these factors is neglected, model performance and model results may become questionable.

 

Data Management

The quality and availability of the relevant data determine the strength of a model. Without proper data the model will not be able to function at an optimal level. Data management contributes to the performance of a model. It should be performed throughout the model’s lifecycle and should include:

 

  • Development Data Information, which refers to the data used in the development of the model.
  • Calibration Data, which is used to calibrate the model.
  • Validation Data, which is used in the validation of the model for regulators.
  • Production Model Data, which is the data used in the day-to-day running of the model.

 

However, within each phase there are specific factors which need to be taken into account, as outlined below:

 

 

Data Requirement – There should be a clear understanding of the data fields / attributes applicable to a model, for both its input and its output.

 

Data Availability – All sources from which the model derives the relevant data should be known to the organisation. These data fields should be monitored to ensure that data is not removed / manipulated by other business lines in the organisation.

 

Data Accessibility and Control – This refers purely to the transfer of data between source, model, and data warehouse. It is vital that data arrives at its destination in the desired format and that no undesired manipulation occurs.

 

Data Archiving – This involves the storage of data utilised by the model. It is important to have proper version / timestamp control on each data set / sample utilised in the model.
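As a minimal sketch of such version control, assuming nothing beyond the standard library, each data set can be archived under a content hash and timestamp so that any sample used for development, calibration, or validation can later be retrieved and replayed exactly:

```python
import hashlib
import json
import time


def archive_dataset(records, label, store):
    """Archive a data set under a content hash with a timestamp.

    The hash doubles as a version identifier: the same records always
    produce the same key, so a calibration or validation run can cite
    exactly which sample it used.
    """
    payload = json.dumps(records, sort_keys=True).encode("utf-8")
    digest = hashlib.sha256(payload).hexdigest()
    store.setdefault(digest, {"label": label,
                              "archived_at": time.time(),
                              "records": records})
    return digest
```

In practice the store would be a database or document repository rather than an in-memory dictionary, but the principle of immutable, identifiable samples is the same.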

 

Data Quality – Data Quality management should be an ongoing objective of the organisation. It forms part of the data governance of the organisation and should be maintained by all data users, custodians, and data owners. The quality of data is correlated with the performance of a model.
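The ongoing quality objective can be expressed as automated checks against the model's data requirement. The field names and rules below are hypothetical, but the pattern of validating completeness and permissible ranges on every extract is general:

```python
def quality_report(records, required_fields, ranges):
    """Flag missing and out-of-range values in a model input extract.

    `required_fields` lists fields that must be populated; `ranges` maps
    a field to its permissible (low, high) bounds. Returns a list of
    (row index, field, issue) tuples for exception handling.
    """
    issues = []
    for i, row in enumerate(records):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, field, "missing"))
        for field, (low, high) in ranges.items():
            value = row.get(field)
            if value is not None and not (low <= value <= high):
                issues.append((i, field, "out of range"))
    return issues
```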

 

Model Performance Management

Change within an organisation, and within the industry in which it operates, has a significant impact on the requirements of a specific model. In an environment characterised by such change a model may have a limited lifespan in terms of its functionality, predictive power, and ultimately its value to the organisation. It is for this reason that model performance and significance should be constantly monitored.

 

Model performance management ensures and encourages the accuracy, robustness, and timeliness of a model to ultimately produce meaningful and valuable output. In order for a model to operate at an optimal level, standards and controls should be implemented and monitored on a consistent basis. An organisation should assess model performance on an ongoing basis to identify whether the model should be:

 

  • Recalibrated – This may enhance the predictive power of the parameters in the model.
  • Redeveloped – If certain variables / parameters / methodologies have become obsolete, the model developer may decide to rebuild the model into a more robust solution.
  • Replaced – A better suited or more robust solution has been identified to replace the current model.
  • Decommissioned – The need and outcome of a model has become redundant, therefore the model is decommissioned.
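One widely used trigger for these decisions is the Population Stability Index (PSI), which compares the score distribution seen at development time with the current population; a conventional rule of thumb (illustrative here, not prescribed) treats a PSI below 0.1 as stable and above 0.25 as a prompt to recalibrate or redevelop. A minimal sketch:

```python
from math import log


def psi(expected, actual, bins):
    """Population Stability Index between two score samples.

    `expected` is the development-time sample, `actual` the current
    population, `bins` the shared bin edges.
    """
    def shares(xs):
        counts = [0] * (len(bins) - 1)
        for x in xs:
            for i in range(len(bins) - 1):
                if bins[i] <= x < bins[i + 1]:
                    counts[i] += 1
                    break
        total = max(sum(counts), 1)
        # Floor each share to avoid log(0) on empty bins
        return [max(c / total, 1e-6) for c in counts]

    e, a = shares(expected), shares(actual)
    return sum((ai - ei) * log(ai / ei) for ei, ai in zip(e, a))
```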

 

The performance assessment of a model is based on the statistical standards implemented, as well as on the calibration thereof.

 

Statistical Standards

 

There should be a clear connection between the final result on the one hand, and the input data and the assumptions / methodologies used to derive it on the other. Statistical standards refer to the manner in which the model was developed as well as to the overall functioning of the model. They mainly involve the statistical techniques the model is based on and the parameters utilised within these methodologies.
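For a PD-type model, one common statistical standard is discriminatory power, usually reported as the Gini coefficient derived from the AUC. The sketch below, which assumes that higher scores indicate higher risk, computes AUC directly as the probability that a defaulter outscores a non-defaulter:

```python
def auc(defaulter_scores, non_defaulter_scores):
    """AUC: probability that a randomly chosen defaulter scores higher
    than a randomly chosen non-defaulter (ties count as half)."""
    wins = 0.0
    for d in defaulter_scores:
        for n in non_defaulter_scores:
            if d > n:
                wins += 1.0
            elif d == n:
                wins += 0.5
    return wins / (len(defaulter_scores) * len(non_defaulter_scores))


def gini(defaulter_scores, non_defaulter_scores):
    """Gini (accuracy ratio) = 2 * AUC - 1; 0 is random, 1 is perfect."""
    return 2.0 * auc(defaulter_scores, non_defaulter_scores) - 1.0
```

Tracking this statistic over time, against the value achieved at development, shows whether the model's discriminatory power is deteriorating.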

 

Calibration

 

Calibration refers to the adjustments made to the parameters utilised within the model, using statistical techniques and sample data, to optimise the model’s performance. Management of the calibration process should include identifying and archiving the sample data utilised, the statistical techniques employed, and the results implemented in the model from the calibration. An organisation should be able to replicate the calibration of a model.

 

Model Monitoring and Reporting

The previous section touched on the monitoring of the statistical standards and calibration of the model. In fact, the input and output of the model should be subject to continual review. In particular, regular sanity checks should be conducted on the output of a model so that infeasible results can be responded to promptly. If the monitoring process complies with set standards, potential issues can be rapidly and easily identified through exception reporting.
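Such a sanity-check layer can be very simple. In this hypothetical sketch, any score that is missing or falls outside the feasible range for a probability is routed to an exception report rather than passed downstream:

```python
def exception_report(scores, low=0.0, high=1.0):
    """Collect accounts whose model output is missing or infeasible.

    `scores` maps an account identifier to its model output; the bounds
    default to those of a probability.
    """
    exceptions = []
    for account, score in scores.items():
        if score is None:
            exceptions.append((account, "no score produced"))
        elif not (low <= score <= high):
            exceptions.append((account, f"score {score} outside [{low}, {high}]"))
    return exceptions
```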

 

The first instance of actual monitoring and reporting of a model is within the validation phase of the model. From a supervisory point of view, validation is required to prove the robustness and suitability of the model within the overall framework of an organisation. Validation is purely the responsibility of the organisation, but must be reported to its supervisor or regulator. Although validation is mainly required for regulatory purposes, an organisation should implement frequent validations to ensure that the model is functioning as per its initial assessment and purpose.

 

Administrative Information Management

Analysis of an issue requires a basic background understanding, which often proves impossible to obtain when the administrative information supporting a model is lacking. Administrative information refers to, amongst others, documentation stipulating the methodologies, business process documentation, user manuals, contact personnel, important dates, version control, and software and technology information. This information relates to the operation of the model within the organisation. All administrative information should be seen as a crucial part of the assessment and monitoring of a model at all phases of the model’s lifecycle.

 

Model Environment

With the appropriate measures in place for the management of individual models, the question arises as to how an organisation can centrally consolidate all its implemented models. It is recommended that this be achieved by creating a model information and assessment environment at group consolidation level. This environment should cater for access to the model, its current and historical data, the administrative information database, and the reporting framework.

 

 

The purpose of a model environment is to ease the use and maintenance of a model for all model users and owners. It should provide a single interface where all activities and information pertinent to a model can be retrieved. A model-friendly framework will enable an organisation to monitor its models on a day-to-day basis. Ongoing monitoring assists users in gaining a better understanding of the models they utilise, identifying any unforeseen issues or constraints, and, if managed correctly, enhancing not only the performance of the models but also the quality of the data they utilise.
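The single-interface idea can be sketched as a registry keyed by model identifier, through which data sets, administrative information, and monitoring reports are all reached; the names and fields below are illustrative, not a prescribed design:

```python
from dataclasses import dataclass, field


@dataclass
class ModelRecord:
    """One entry in a group-level model registry (illustrative fields)."""
    name: str
    owner: str
    lifecycle_phase: str                            # Analysis / Development / Implementation / BAU
    datasets: dict = field(default_factory=dict)    # label -> dataset reference
    admin_info: dict = field(default_factory=dict)  # documentation, contacts, versions
    reports: list = field(default_factory=list)     # monitoring / validation reports


class ModelRegistry:
    """Single interface to all models implemented across the group."""

    def __init__(self):
        self._models = {}

    def register(self, model_id, record):
        self._models[model_id] = record

    def get(self, model_id):
        return self._models[model_id]

    def by_phase(self, phase):
        """All models currently in a given lifecycle phase."""
        return [m for m in self._models.values() if m.lifecycle_phase == phase]
```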

 

Conclusion

Implementing world class models in an organisation, whether for decision-making or regulatory purposes, is only the start of model management. Providing adequate support to each model through a structured framework, and thereafter centrally consolidating the management of models in a model environment, not only improves model performance, but also creates the capability to identify lurking problems which would otherwise go unnoticed.

 

Monocle has extensive experience in assisting organisations around the world in the areas of model development, model validation, and model management.