The science of IFRS 9 and the art of Basel: Use of parametric thinking in provisioning

SUITS THE C-SUITE By Christian G. Lauron

Business World (07/02/2018 – p.S1/4)

(Last of three parts)

IFRS 9 is an International Financial Reporting Standard (IFRS) promulgated by the International Accounting Standards Board on July 24, 2014. It addresses the accounting for financial instruments and covers three main topics: classification and measurement of financial instruments; impairment of financial assets; and hedge accounting. It became effective on Jan. 1, 2018 and replaced International Accounting Standard (IAS) 39, Financial Instruments: Recognition and Measurement, and all previous versions of IFRS 9. In this article, IFRS 9 is referred to as a “science” because it is a systematically organized body of information and measurements on specific topics.

Basel III (or the Third Basel Accord or Basel Standards) is a global, voluntary regulatory capital and liquidity framework agreed upon by the members of the Basel Committee on Banking Supervision (BCBS) in 2010–11. It was scheduled to be introduced from 2013 until 2015; however, implementation has been extended to March 31, 2019. Another round of changes was agreed upon in 2016 and 2017 (informally referred to as Basel IV), and the BCBS is proposing a nine-year implementation timetable, with a “phase-in” period to commence in 2022 and full implementation expected by 2027. Basel III was developed in response to the deficiencies in financial regulation that came to light after the financial crisis of 2007–08. Basel III is intended to strengthen banks’ capital requirements, liquidity, maturity profile, and leverage. It also introduced macroprudential elements and capital buffers designed to improve the banking sector’s ability to absorb shocks from financial and economic stress, and to reduce spillover effects from the financial sector to the real economy. Basel is referred to here as an “art” because fully comprehending its dynamic processes and uncertainties calls for skillful planning and creative visualization.

The timing of BSP Circular 989 and the adoption of IFRS 9 will be of equal interest to regulators and to boards and management, who are keen to understand, on one hand, the impact on financial institutions’ (FIs) loss-absorbing capacity under stressful conditions and the implications for macroprudential policy and, on the other, the strengthening of strategic plans. FIs are expected to work with expected credit loss (ECL) and time-series data sets and calculation templates at granular and portfolio levels, and to draw upon multiple scenarios using their own expanded methodologies. They will need clarity on which scenarios would be considered base case and which stressful, in order to help establish the range that feeds into the overlay mechanism of IFRS 9. As this development unfolds, the top-down and bottom-up approaches to adjusting Probability of Default (PD) for the overlay mechanism will become manifest in the coming months, so it is helpful to understand these two simultaneous processes that may converge (with corporate and institutional exposures in mind).

Under the bottom-up approach, the PD is determined from the base credit risk model that accounts for idiosyncratic properties before it is adjusted for industry-level factors. The final adjustment is the overlay of the macroeconomic scenarios. Methodology-wise, this process involves recalibrating the rating or scoring PD models to incorporate macroeconomic factors. In practice, and for communication purposes, it would be helpful to distinguish the base PD from the corresponding overlay adjustment, which could be illustrated as a scalar or multiplier of 1.0 to 1.2 under a stressed view of the economy.
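As a sketch of the bottom-up sequence described above, the code below applies a hypothetical industry-level adjustment and then a macroeconomic overlay scalar (in the 1.0-to-1.2 range mentioned) to a base PD. The factor values and function names are illustrative assumptions, not anything prescribed by IFRS 9.

```python
# Illustrative sketch (assumed values): bottom-up PD adjustment.
# The base PD from the obligor-level credit risk model is first adjusted
# for industry-level factors, then overlaid with a macroeconomic scalar.

def bottom_up_pd(base_pd, industry_factor, macro_scalar):
    """Apply the industry adjustment, then the macro overlay; cap PD at 1."""
    if not 1.0 <= macro_scalar <= 1.2:
        raise ValueError("overlay scalar assumed to lie in [1.0, 1.2]")
    adjusted = base_pd * industry_factor      # idiosyncratic -> industry level
    return min(adjusted * macro_scalar, 1.0)  # macro overlay, capped at 100%

# Example: 2% base PD, mild industry uplift, stressed macro view.
final_pd = bottom_up_pd(0.02, industry_factor=1.05, macro_scalar=1.2)
```

Keeping the base PD and the overlay multiplier separate, as the article suggests, makes the macro adjustment easy to communicate and to unwind when scenarios change.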

On the other hand, the top-down approach is influenced by macroeconomic modeling that may involve autoregression and would use a combination of an underlying Basel PD model and a portfolio model associated with stress testing. This Basel PD model produces a long-run or through-the-cycle PD that requires scaling, such that the portfolio-average PD matches the PD predicted by the stress testing model. Forward-looking macroeconomic factors are applied in this exercise, with a scalar derived through optimization when linking the two models. In addition to regression, single-factor models and credit-index approaches may also be employed in top-down approaches.
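A minimal sketch of the top-down scaling step, under simplifying assumptions: through-the-cycle PDs from a Basel-style model are scaled by a single factor so that the portfolio-average PD matches the PD predicted by the stress testing model. With one multiplicative scalar the "optimization" reduces to a ratio; real implementations would use richer link functions. All figures are assumed.

```python
# Illustrative sketch: derive a top-down scalar so the portfolio-average
# through-the-cycle (TTC) PD matches the stressed PD from a portfolio model.

def top_down_scalar(ttc_pds, stressed_portfolio_pd):
    """Scalar k such that mean(k * ttc_pds) == stressed_portfolio_pd."""
    avg_ttc = sum(ttc_pds) / len(ttc_pds)
    return stressed_portfolio_pd / avg_ttc

ttc_pds = [0.01, 0.02, 0.03]        # assumed obligor-level TTC PDs
k = top_down_scalar(ttc_pds, 0.03)  # stress model predicts a 3% portfolio PD
scaled = [k * pd for pd in ttc_pds] # stressed, forward-looking PD proxies
```

The same idea generalizes to a scalar found by minimizing the gap between the two models' portfolio-level predictions when a closed-form ratio is not available.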

Currently we are observing more bottom-up approaches being employed by the industry as it improves its base credit risk models and the relevant industry factors given the segment of the exposure. With the introduction of BSP Circular 989, we would expect top-down approaches to be revisited.

At some point within the next 12 to 15 months, we would expect a ‘VaR to VAR’ methodology connection between IFRS 9 and stress testing. From Value-at-Risk models to Vector Autoregressive models and back, this development could usher in a second generation of overlay and stress testing models that would allow economic forecasting (potentially reducing the probability-weighting exercise to a sense check rather than the main input) and incorporate lifetime and transition criteria. Regardless of the advancements implemented by financial institutions, there are helpful operational guidelines to note. The first is that the exercise could result in an unintended front-loading of losses, leading to capital erosion. The second is that any stress testing methodology and overlay mechanism should be connected to internal risk management, notwithstanding the regulatory floors that may be imposed for capital adequacy purposes. The final point is that the Board should be directly involved in the identification and evaluation of stress scenarios, the stress test interrelationship map, and oversight of the macroeconomic projections and their linkage to the institution’s resilience plan. Readers may refer to our July 26, 2010 article in this column, “Stress Testing as a governance tool,” for more guidance.
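The probability-weighting exercise mentioned above can be sketched as a weighted average of scenario-conditional losses, where each scenario carries its own PD (for instance, one conditioned on a VAR-driven macro path). All figures below are assumed for illustration, and ECL is taken in its simplest PD × LGD × EAD form.

```python
# Illustrative sketch: probability-weighted ECL across macro scenarios.
# Each scenario supplies a PD conditioned on that scenario's macro path;
# weights are the probabilities assigned to the scenarios (summing to 1).

scenarios = [
    {"name": "base",     "weight": 0.60, "pd": 0.020},
    {"name": "upside",   "weight": 0.15, "pd": 0.012},
    {"name": "downside", "weight": 0.25, "pd": 0.045},
]
LGD, EAD = 0.40, 1_000_000  # assumed loss-given-default and exposure at default

weighted_ecl = sum(s["weight"] * s["pd"] * LGD * EAD for s in scenarios)
```

If forecasting models mature as the article anticipates, this weighting step becomes a sense check on a model-driven central estimate rather than the primary calculation.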

While the overlay mechanism prepares us for the foreseeing function and expansive view, let’s not lose sight of the tightening of the data, systems and processes within the base ECL model. In particular, for the PD determination for the corporate and institutional exposures, we are recommending the following that should be viewed as loops and iterations rather than as a set of finite linear steps:

1. Segmentation process — covering the traditional data processing and management, risk profiles and internal risk rating system, with a subset for emerging and unstructured data capture assessments, “Big Data Small Data” initiatives, and clustering of observed attributes and properties.

2. Credit evaluation — covering mainly the financial condition, industry assessment and outlook and management quality of the corporate and institutional customers.

3. Assessment of factors and variables — covering both single-factor and multifactor analysis, variable binning, and other approaches used to ascertain the relationship between the data points and the intuition and judgment of experts, to be used in the model selection step.

4. Model selection — covering model runs that will result in candidate integrated models (composed of main and sub-models); these models initially start with an optimization algorithm of instructions and eventually “learn” over time; it may take another two to three learning rounds over the next 12 to 15 months to help stabilize the PD models.

5. PD transformation — covering the derivation of the through-the-cycle and point-in-time PD from the models chosen.

6. Portfolio analytics — assessment of the results against the internal policies and portfolio management, which then feeds back to the segmentation process.
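Step 5 above, the PD transformation, can be illustrated with the widely used one-factor (Vasicek-type) conditioning formula, which converts a through-the-cycle PD into a point-in-time PD given a realization of a systematic factor. The asset correlation, the sign convention for the factor, and all values below are assumptions for illustration, not a prescribed calibration.

```python
# Illustrative sketch: one-factor TTC-to-PIT PD transformation.
from statistics import NormalDist

_N = NormalDist()  # standard normal, for cdf and inverse cdf

def pit_pd(ttc_pd, rho, z):
    """PIT PD = Phi((Phi^-1(TTC PD) + sqrt(rho) * z) / sqrt(1 - rho)).
    rho is the assumed asset correlation; by the sign convention used
    here, z > 0 represents a downturn realization of the macro factor."""
    num = _N.inv_cdf(ttc_pd) + (rho ** 0.5) * z
    return _N.cdf(num / (1.0 - rho) ** 0.5)

base = pit_pd(0.02, rho=0.12, z=0.0)    # neutral macro factor
stress = pit_pd(0.02, rho=0.12, z=2.0)  # two-sigma downturn
```

The same formula, run in reverse over a cycle of factor realizations, is one way to recover a through-the-cycle anchor from point-in-time observations.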

This “future-proofing” recommendation will help FIs transition from parametric thinking to the rise of coding drivers — specifically, the adoption of machine learning while monitoring progress in artificial intelligence for risk management and provisioning calculations.

At this point, the emerging parametric thinking underlying the ECL calculation has established the boundaries of PD and Loss-Given Default (LGD) to reflect both idiosyncratic properties and to a certain extent — through the overlay mechanism — the systematic risk that the obligor, or broadly the portfolio, is exposed to. But what is this systematic risk factor? Are we still talking about the generic market or financial economy? Or should this now be expanded to include “funding the real economy” discussions?

The connection between Basel and IFRS 9 has been limited so far to excessive concentration, contagion and spillover risks. What has not been covered are the network and transmission risks that arise from stagnation. In an upcoming article, we will examine the application of this thinking to the areas that have the strongest potential to break inertia and have an impact on the economy — agriculture and infrastructure.

This article is for general information only and is not a substitute for professional advice where the facts and circumstances warrant. The views and opinions expressed above are those of the author and do not necessarily represent the views of SGV & Co.

Christian G. Lauron is a Partner of SGV & Co.