Liquidity at Risk: A Measurement Approach within Banking Institutions
September 6, 2017 - Monocle Research Department
The liquidity and credit crisis of 2007 and 2008 left the global financial community concerned over liquidity risk. Suddenly, multi-national banking groups realised they had to broaden their understanding and measurement of risk beyond market, credit, and operational risk. They needed to be able to anticipate liquidity strains in the markets – and strategise contingent funding.
While Basel II guidelines had not adequately addressed liquidity risk, the Basel III proposals aimed to close this gap, with what many view as onerous liquidity ratios, introduced in 2015. The ratios present a conundrum for banks: they need to comply with them, while still maintaining a competitive funding structure. But it’s not the ratios themselves that bring value; it’s the data accuracy and precision they demand.
The validation required for liquidity risk models will actually mean banks can get both a holistic and a granular view of their risks. For example, the proposals imply that banks will need to distinguish between the different behavioural characteristics of diverse customers within a particular product. That means enhanced risk management capability, as well as improved pricing and customer selection.
If banks want to achieve the ambitions of the Basel regulations – and create a single, integrated platform for liquidity risk management, pricing, capital management, and strategic customer selection – they’ll need to implement a data-centric, market-factor-driven liquidity risk management framework: one that integrates credit, market, interest rate, and liquidity risk into a consistent set of metrics. And in order to get that integrated risk view, and assess precisely how much each risk type contributes to the liquidity position, banks need granular data differentiation.
For Monocle, that means a comprehensive measurement and management approach for deeper understanding of liquidity risk and its potential interaction with other risks. It’s an integrated way to treat Liquidity-at-Risk (LaR). The process enables stress and scenario testing under market crises, leading to quantification of levels of contingent funding. It also helps to find more optimal loan-to-deposit ratios by investigating reliance on the wholesale funding markets.
LaR is a framework that includes simulation of a large number of future cash flow profiles, by replicating the entire cash flow process under each circumstance of contractual cash flows, behavioural cash flows, growth in asset and liability sizes, and interest rate re-pricing of each position. Each simulation offers a picture of how the balance sheet and, ultimately, the cash inflows and outflows, may evolve under different scenarios. Ultimately, it assists banks in anticipating strains, and managing liquidity risk.
When we talk about liquidity risk, we talk, simply, about “the ability to fund increases in assets and meet obligations as they come due, without incurring unacceptable losses”. When we drill down, liquidity risk encompasses both market liquidity risk, the risk that a position cannot be offset or eliminated without economic loss, and funding liquidity risk, the risk that cash flow and collateral needs cannot be met in the normal course of business.
Conventionally, liquidity risk has been managed and measured within the Asset and Liability Management (ALM) function. But then the strains in the wholesale funding markets in August 2007 and September 2008 highlighted the interrelationships between funding and market liquidity risk, funding liquidity risk and credit risks, funding concentration and liquidity risk, and the effects of reputation on liquidity risk. We now know that liquidity risk is consequential – and cannot be viewed in isolation. So, today, banks veer towards an integrated risk management approach, in which all risk types are measured inclusively.
Monocle’s LaR model quantifies credit, market, liquidity, and interest rate risk using a single set of underlying risk factors, allowing a bank to view various ‘future states of the world’ from an integrated risk perspective. It also allows the risk management function in a bank to isolate the impact of these risk types on the liquidity shortfall for a particular tenor, with a particular confidence level. The goal? A stable, robust metric for the measurement of a liquidity gap, which includes the impact of credit, market, and interest rate risk.
Why LaR Measurement is Crucial
When wholesale funding markets seized up, existing liquidity models failed. Why? Because they were predominantly based on point estimates, rather than LaR distributions. Point estimates cannot accommodate the scenario tests and stress testing necessary to assess a bank’s liquidity position under market extremes, except in an overly simplified manner. The result was that many organisations didn’t fully understand the speed and severity with which their liquidity position would deteriorate in these extreme markets. And banks were not prepared for deleveraging debt markets, in which there were more sellers of debt securities than buyers. The desperate need for liquidity forced many banks to issue debt at previously unthinkable spreads, and liquidate assets at previously unthinkable prices. But if they’d been able to anticipate or predict liquidity strains in the markets – and the contingent funding requirements – that desperation could have been avoided.
A contingent funding requirement measure needs to look at the inter-relatedness between different risk types, and report funding requirements at different levels of confidence. Of course, banks are not only funded via wholesale markets, but also by deposits, reflected in the bank’s overall loan-to-deposit ratio.
LaR is a more distribution-based approach to the measurement of liquidity risk, under a range of different market rate scenarios. It looks at the effect that these scenarios may have, not only on the wholesale funding markets, but also on the behavioural aspects of the overnight core to non-core deposit base, and gives banks substantially greater insight.
Who will particularly benefit from this approach? Banks that, in the past, relied too heavily on wholesale market funding, rather than retail and commercial deposit funding. At these organisations, with their high loan-to-deposit ratios, liquidity deterioration was particularly rapid and severe.
By analysing the differential behaviour of different client types, these banks will be able to create deposit products with longer periods of limited redemptions. This will reduce reliance on the wholesale market, help identify more ‘ideal’ loan-to-deposit ratios, and may well provide cheaper funding for the bank, affording it the ability to improve pricing.
All that is needed is a commitment to carefully monitoring the metrics, so that banks enhance their understanding of funding volatility and of the specific circumstances that could result in a sudden funding requirement. The bank will also be able to continuously monitor access to money markets as a function of evolving macroeconomic conditions.
This increased awareness will provide the breathing space needed to strategise alternative funding mechanisms – before any seizing up of liquidity markets. Ideally, it can also offer real-time identification of the order in which assets should be liquidated and made available for liquidation, depending on the expected duration and severity of liquidity strains.
So how does it work under each circumstance of contractual cash flows, behavioural cash flows, growth in asset and liability sizes, and interest rate re-pricing of each position?
Behavioural Modelling of Product Cash Flows
The LaR model is derived from a distribution of simulated economic factors, which drive simulations of the bank’s balance sheet and result in simulated net cumulative cash flow (NCCF) profiles over the funding horizon. Distributions of the bank’s NCCF at different times are generated as the outcome of the factors impacting the behaviouralisation of product cash flows over defined intervals.
The framework simulates many versions (say, 100 000) of future cash flow profiles by replicating the entire cash flow process under each circumstance of contractual cash flows, behavioural cash flows, growth in asset / liability sizes, and interest rate re-pricing of each position. Asset Liability Management (ALM) is generally responsible for predicting the value and timing of daily cash inflows and outflows – both on a contractual and a behavioural basis.
Contractually, the cash flows of assets, liabilities, and off-balance sheet items are known with relative certainty. Term loans, which follow standard amortising schedules, stipulate a monthly payment from the term loan customer. These cash flows can be disaggregated into interest and principal components for simulations of Net Interest Income (NII) and other measures of bank profitability.
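As a simple illustration of the contractual side, the sketch below splits each level payment of a standard amortising term loan into its interest and principal components; the function name and parameters are illustrative, not part of any particular bank’s system:

```python
def amortising_cash_flows(principal, annual_rate, n_months):
    """Disaggregate the level monthly payments of a standard
    amortising term loan into interest and principal components."""
    r = annual_rate / 12.0
    # Level monthly payment from the standard annuity formula
    payment = principal * r / (1 - (1 + r) ** -n_months)
    balance, schedule = principal, []
    for month in range(1, n_months + 1):
        interest = balance * r       # interest accrued on the outstanding balance
        repaid = payment - interest  # the remainder of the payment reduces principal
        balance -= repaid
        schedule.append((month, interest, repaid))
    return payment, schedule
```

The interest and principal components returned here are exactly the inputs needed for simulations of Net Interest Income and other profitability measures.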
There is, however, no world-wide standard on the contractual terms or conditions which may be imposed on a bank’s products. Cash flow functions must be customised to cater for the unique product types and payment structures a bank offers its customers.
The risk, across many product types, is that the cash flows expected under the contractual profile differ from actual, experienced cash flows. This is because of the options customers have to deviate from the initial terms and conditions of their account: Pre-Payment Risk (the risk that a customer will repay an asset before the contractual maturity date); Early-Redemption Risk (the risk that a client will withdraw a deposit before the contractual maturity date); or Rollover Risk (the risk that a liability will reach its maturity date, and a higher funding cost will be demanded by the depositor to roll it over). But it could also be because of behavioural market impacts, such as credit risk, resulting in the cessation of a particular set of cash flows.
Behavioural modelling, using econometric techniques and statistical methods, can estimate future client behaviour, and transform the results of these models into predicted cash flows. It also includes statistical techniques used to predict the behaviouralised cash flows of portfolios of fluctuating products, such as savings and current accounts. Simply, it adjusts contractual cash flow calculations to reflect the most likely client behaviour in the future.
Behavioural models used by banks have typically been long-run averages of past behaviour, so they’re not sensitive to the prevailing economic environment, or to potential future economic developments. But it is possible to relate the level of each behavioural risk directly to an economic factor. Let’s look at pre-payments, for example. We can create a statistical model which relates the level of pre-payments in housing loans to selected interest rates or interest rate changes. Logically, as interest rates decrease, banks and their competitors are able to reduce the level of interest charged on products such as housing loans. And, of course, offers of lower rates entice customers to refinance.
So we can estimate the level of prepayment on sub-portfolios, given the prevailing level and changes in key interest rates. Behavioural cash flow models address the full range of possible behavioural adjustments for each and every product, for diverse client types.
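To make the pre-payment example concrete, here is a minimal sketch of one possible functional form; the logistic link and the coefficients (`base_rate`, `sensitivity`) are hypothetical placeholders for values a bank would estimate from its own sub-portfolio history:

```python
import math

def prepayment_rate(rate_change, base_rate=0.02, sensitivity=-8.0):
    """Illustrative logistic model of the monthly prepayment rate as a
    function of the change in a key mortgage rate (in decimals).
    A rate cut (negative change) raises the refinancing incentive,
    so prepayments increase. All coefficients are hypothetical."""
    # Log-odds of the baseline prepayment level, shifted by the rate move
    log_odds = math.log(base_rate / (1 - base_rate)) + sensitivity * rate_change
    return 1 / (1 + math.exp(-log_odds))
```

Under these assumed parameters, a 100bp rate cut pushes the predicted prepayment rate above its 2% baseline, while a rate rise pushes it below.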
There are also other behavioural aspects that impact the asset and liability sides of the balance sheet. In particular, banks tend to lend at an internal or ‘prime’ rate, which adjusts non-linearly to prevailing market rates. And much liquidity risk modelling has excluded the impact of business growth and of budgeting and forecasting targets. What we need to do, then, is combine interest rate sensitivity models and asset/liability portfolio growth models with a holistic set of liquidity risk behavioural models.
Interest Rate Sensitivity
Cash flow amounts are usually dependent on the amount of interest charged on, or accrued to, the asset or liability. One of the challenges faced in modelling future cash flows is the fact that bank rates are often determined by the bank itself, rather than by the market. This makes it difficult to predict the bank’s response to changes in market interest rates.
Our methods would estimate the relationship between a change in market rates, and the resulting change in internal lending and deposit rates. This is then input to a simulation to determine the level of interest charged on individual and corporate accounts, and therefore the value of projected cash flows.
For products with a reference rate based on an internal lending or deposit rate, rather than a market rate, Monocle has a methodology to predict future levels of internal lending and deposit rates, given the levels and changes in market rates.
Historically, there’s been a strong relationship between internal rates and market rates, but this may not always be the case. As market rates decrease, banks generally apply these ‘savings’ to their customers by reducing internal lending rates, to remain competitive. Similarly, as market rates rise, the banks pass these costs onto their customers by raising internal lending rates.
By looking at 5 to 10 years of market rates of all tenors, we can estimate the relationship between these rates and internal bank rates over the same period. This relationship can then be embedded within a statistical model which translates movements in market rates into a probability of change for internal rates. When the probability of change reaches a key value, the reference or prime rate is assumed to increase or decrease, depending on recent movements in market rates.
Since market rates are easier to simulate and project into the future, the model allows the bank to understand how each market rate scenario in a large simulation (typically 100 000 iterations) translates into changes in internal rates, resulting in a prediction of internal rates into the future. This can then be used for Net Interest Income (NII) and other profitability scenario analyses.
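As a hedged sketch of the threshold mechanism described above, the following maps cumulative recent market-rate moves to a probability of a prime change; the logistic link, the 25bp step, and the parameters `k` and `threshold` are all assumptions for illustration, not an actual calibration:

```python
import math

def step_prime_rate(prime, recent_market_moves, threshold=0.5, step=0.0025, k=40.0):
    """Illustrative threshold model for an administered 'prime' rate.
    Cumulative recent market-rate moves are mapped to a probability of
    a prime change via a logistic link; when that probability crosses
    the threshold, prime steps by 25bp in the direction of the recent
    market movement. k, threshold, and step are hypothetical."""
    cum_move = sum(recent_market_moves)
    prob_change = 1 / (1 + math.exp(-k * abs(cum_move)))  # larger moves -> higher probability
    # At cum_move = 0 the logistic gives exactly 0.5, so no change is triggered
    if prob_change > threshold and cum_move != 0:
        prime += step if cum_move > 0 else -step
    return prime
```

Applying this step month by month along a simulated market-rate path yields a predicted internal-rate path, which in turn feeds the cash flow and NII calculations.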
Asset/Liability Portfolio Growth Models
Typically, a liquidity risk model is premised either on a run-off basis, where assets and liabilities are not replaced as they reach maturity, or on a business-as-usual basis, where assets and liabilities are replaced as they mature. However, to achieve true business value, we need to understand, at a product level and at a client-type level, what the growth in a particular asset or liability will be. LaR takes an ‘adjusted business-as-usual’ approach, in which the projected growth in each asset and liability portfolio, as a result of changes in economic factors, is taken into account.
As part of the simulation process, there is a series of techniques to predict the future growth in asset and liability values on a bank’s balance sheet. The models rely on the historical relationship between growth rates and interest rates, which can be combined in a multi-variable regression. Of course, we know that relationships between interest rates and growth levels have not always remained intact – particularly in the recent financial crisis. Market interest rates (interbank rates) have been near all-time lows for some time, which, under normal circumstances, would be a leading indicator of higher growth rates. But financial institutions have decreased their appetite for risk, and for extending credit, drastically reducing growth levels below what would normally be expected.
Because of this, growth models incorporate a “desirability” factor, which is a numerical indication of the bank’s appetite for extending credit, or for growing a particular product type. This “desirability” factor adjusts the output of the growth models, which are purely linked to interest rates.
Economic variables other than interest rates can also be taken into account, particularly those shown to be leading indicators, such as inventories, sentiment indices, and money supply growth.
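A minimal sketch of such a growth model, assuming a linear regression on economic factors scaled by the “desirability” factor, might look as follows; the function and all coefficients are hypothetical stand-ins for values estimated from a bank’s own history:

```python
def projected_growth(intercept, coeffs, factors, desirability=1.0):
    """Illustrative portfolio growth model: a linear regression of the
    portfolio growth rate on lagged economic factors (e.g. an interbank
    rate, a sentiment index, money-supply growth), scaled by a
    'desirability' factor expressing the bank's appetite for growing
    this product (1.0 = neutral, below 1.0 = reduced appetite)."""
    rate_driven = intercept + sum(b * x for b, x in zip(coeffs, factors))
    return desirability * rate_driven
```

The desirability scaling is what lets the model depart from the purely rate-driven prediction when, as in the recent crisis, the bank’s appetite for extending credit falls even while rates would suggest higher growth.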
Combining Predictive Models in the LaR
LaR typically requires generating 10 000 to 100 000 simulations of underlying market factors over a one-year horizon. At a minimum, LaR will use simulated market interest rates as a driving economic factor, because interest rates usually show the strongest predictive power in behavioural modelling. The interest rate simulation produces 10 000 to 100 000 observations of possible future interest rate paths, along with paths for other market factors.
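The rate simulation itself can take many forms; as a minimal sketch, a discretised mean-reverting process can generate the required paths. The Vasicek-style model and all parameters here are assumptions, standing in for whichever calibrated rates model a bank actually uses:

```python
import random

def simulate_rate_paths(n_paths, months=12, r0=0.07, kappa=0.5,
                        theta=0.07, sigma=0.01, seed=1):
    """Illustrative market-rate simulator using a discretised
    mean-reverting (Vasicek-style) process. Model choice and
    parameters are assumptions for this sketch."""
    rng = random.Random(seed)
    dt = 1.0 / 12.0  # monthly time step
    paths = []
    for _ in range(n_paths):
        r, path = r0, []
        for _ in range(months):
            # Euler step: pull towards theta plus a random shock
            r += kappa * (theta - r) * dt + sigma * (dt ** 0.5) * rng.gauss(0, 1)
            path.append(r)
        paths.append(path)
    return paths
```

Each of the resulting paths then drives one full balance-sheet simulation.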
These interest rate paths, and other market factor paths, are used to drive the contractual cash flow models, the behavioural cash flow models, growth models, and interest rate re-pricing models. These models interact to provide a picture of how the balance sheet ‘evolves’ over time. New ‘synthetic’ accounts are created to compensate for predicted growth value, and cash flows are, in turn, calculated for these new accounts. The replication of the balance sheet is an image of how the future balance sheet may look.
One method we use is to hold the bank’s interbank activity constant over the funding horizon, to measure its reliance on the interbank market in times of heightened demand for liquidity. With the foregoing assumptions and processes in place, the simulation is run. In each month of the horizon, it’s possible to measure the liquidity gap on a point-in-time basis. Simply, the cash inflows and outflows for that month are considered in isolation, to assess whether its liquidity gap is positive or negative.
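Putting the pieces together, a deliberately simplified sketch of the monthly accumulation might look as follows; the behavioural inflow/outflow rules and all coefficients here are toy stand-ins for the fitted contractual, behavioural, and growth models described above:

```python
import random

def simulate_nccf_paths(n_sims, months=12, seed=42):
    """Toy end-to-end sketch: simulate a market-rate path, derive a
    net monthly cash flow from each rate via simple placeholder
    behavioural rules, and accumulate into an NCCF path per
    simulation. All coefficients are illustrative."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_sims):
        rate, nccf, path = 0.07, 0.0, []
        for _ in range(months):
            rate += rng.gauss(0.0, 0.002)  # simulated market-rate path
            # Placeholder behavioural rules: rate rises slow inflows,
            # rate falls (refinancing incentive) speed outflows
            inflow = 100.0 * (1 - 5.0 * max(rate - 0.07, 0))
            outflow = 95.0 * (1 + 3.0 * max(0.07 - rate, 0))
            nccf += inflow - outflow   # point-in-time gap accumulates into the NCCF
            path.append(nccf)
        paths.append(path)
    return paths
```

Running this for tens of thousands of simulations produces, at each month, a distribution of cumulative cash flow outcomes rather than a single point estimate.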
At the end of the one-year horizon, for example, banks can calculate a distribution of possible one-year Net Cumulative Cash Flows (NCCF). This is the sum of all twelve point-in-time liquidity gaps over the year, representing an accumulation of cash flow shortages, or excesses, from the beginning to the end of the year. At the 99.97 percent confidence level, for example (consistent with a AA rating), LaR is the 31st worst observation in a distribution of 100 000 simulations.
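Once the distribution of simulated NCCF outcomes exists, reading off LaR at a given confidence level is a simple order-statistic lookup; a sketch, with the index convention assumed as described:

```python
def liquidity_at_risk(nccf_draws, confidence=0.9997):
    """LaR as a tail observation of the simulated NCCF distribution:
    sort the cumulative cash flow outcomes from worst (largest
    shortfall, i.e. most negative) to best, and take the observation
    at the chosen confidence level. With 100 000 draws and 99.97%
    confidence, 30 observations are worse, so this is the 31st worst."""
    ordered = sorted(nccf_draws)                              # worst shortfall first
    tail_index = int(round(len(ordered) * (1 - confidence)))  # 30 for 100 000 draws
    return ordered[tail_index]                                # 0-based index 30 = 31st worst
```

The 30 observations below the returned value form the 0.03 percent tail of the distribution.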
LaR simulates the progression of a portfolio’s value through time – how cash flows should behave, how balance sheet values should change, and ultimately the cash inflows and outflows that result from these changes. It is known as a dynamic portfolio approach, as the methodology assumes that the portfolio is constantly changing as loans mature and new loans are created. The goal of LaR is to produce 10 000 to 100 000 values for the NCCF at the desired horizon, which could be overnight (1 day), 7 days, or monthly from 1 month to 12 months.
What that does is make it possible to assess the worst-case liquidity shortfall, at a given confidence level, over a one-year period, as well as at different points through that period, given that interbank activity has been held constant. It may even serve to reduce the probability of a severe funding shortfall arising in the first place.
The LaR framework, essentially, is about bespoke models, methods, and techniques that assist banks in managing their liquidity risk – so that, the next time we’re in an extreme market crisis, we may predict, anticipate, and prepare, instead of fumbling around, desperate for liquidity, wondering why we didn’t see it coming.