Mathematical Finance Seminar

December 6, 2001, 5:30 PM to 7:00 PM

W. Bernard Lee and Yoric Perrin, Andersen

Practitioner's Guide to Credit Scenario Generation

Under the new Basel Accord, banks would be allowed to use internally generated credit-worthiness estimates to assess their credit risk exposures, but would have to comply with detailed methodologies and disclosure requirements. Banks could then provide analytical frameworks for each type of loan exposure, the results of which would be aggregated into estimates of potential losses and minimum capital requirements. Since the early 1990s, institutions have been creating special purpose vehicles (SPVs) to obtain more favorable credit ratings, and the method these institutions use to calculate the expected losses of an SPV is a natural candidate for such an analytical framework. Given that the institution's primary objective in such cases is to obtain a favorable rating, any methodology to which the rating agency is accustomed is typically accepted as given. While the rating agencies appear to be well aware of the shortcomings of these approaches, the impact of migrating to a first-principle approach would be significant, so there was little business justification to reconsider them until recently. This presentation proposes a pragmatic framework that uses best-of-breed credit simulation techniques to address the expected loss calculation problem. Specifically, we apply the first-principle approach to measure risk as the amount of money needed to reduce a defaultable portfolio to a default-free portfolio.
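As a rough illustration of the quantity being measured, consider a first-order expected-loss calculation for a small loan portfolio. The exposures, default probabilities, and loss-given-default figures below are invented for illustration and are not drawn from the talk:

```python
import numpy as np

# Toy illustration (invented inputs): expected loss of a loan portfolio,
# a first-order estimate of the cash needed to make a defaultable
# portfolio behave like a default-free one.
exposure = np.array([1_000_000.0, 500_000.0, 250_000.0])  # exposure at default
pd_1y = np.array([0.02, 0.05, 0.10])                      # assumed 1-year default probabilities
lgd = np.array([0.45, 0.60, 0.40])                        # assumed loss given default

# Per-loan expected loss: EL = EAD * PD * LGD; portfolio EL is the sum.
expected_loss = exposure * pd_1y * lgd
portfolio_el = expected_loss.sum()  # ≈ 34,000 for these assumed inputs
```

The harder problem the talk addresses is that, unlike this static calculation, the PD and correlation inputs must themselves come from simulation over scarce credit data.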

The most basic challenge with any firm-wide credit simulation is the lack of reliable data. Historical data that one may take for granted in market risk calculations are at best incomplete in the credit risk problem domain. Moreover, Monte Carlo simulation techniques readily applicable to market risk calculations begin to fail due to the large number of risk factors required in credit risk. We will demonstrate various computational techniques used to estimate usable variance/covariance matrices, as well as methods to create Monte Carlo scenarios that replicate these matrices. Once the Monte Carlo scenarios are generated, the appropriate drift conditions can be applied to ensure that the rates generated are consistent with arbitrage-free conditions. Additionally, the transition from a "real-world" drift to a "risk-neutral" drift at a nominated forward horizon introduces non-trivial computational challenges.

Another significant difference between market and credit risk computation is that the normally correlated relationships between risk factors tend to break down in times of crisis, especially in credit. This is the case both for modeling rates (e.g., credit spreads) and default behavior. Although models of tail dependence have been developed for market risk, we will show why these models may not be sufficient for modeling credit behavior, and demonstrate the potential benefits of copula-based models. We will illustrate the proposed approach using real market data.

One reason the authors have chosen this elaborate mathematical construction is that, once the problem of measuring expected loss is properly formulated, turning it into a stochastic optimization problem becomes tractable. For instance, cascading liability assumptions can be modeled algebraically, and a reasonable objective function would then define the institution's profit objective as maximizing expected loan income minus the sum of the paper coupons/principals and equity write-offs. The robustness of the mathematical construction has a direct impact on whether meaningful solutions can be obtained, as opposed to trivial corner solutions.
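The covariance step above can be sketched generically. When pairwise estimates come from sparse, non-overlapping credit data, the assembled matrix is often not positive semidefinite; one standard repair (assumed here for illustration, not necessarily the authors' technique) clips negative eigenvalues, after which a Cholesky factor yields Monte Carlo scenarios that replicate the repaired matrix. The sample matrix is invented:

```python
import numpy as np

def nearest_psd(a, eps=1e-8):
    """Project a symmetric matrix onto the positive (semi)definite cone
    by clipping its negative eigenvalues."""
    sym = (a + a.T) / 2.0
    vals, vecs = np.linalg.eigh(sym)
    vals = np.clip(vals, eps, None)      # remove negative eigenvalues
    return vecs @ np.diag(vals) @ vecs.T

# Invented example: pairwise estimates that are mutually inconsistent,
# so the raw matrix is indefinite and np.linalg.cholesky would fail on it.
raw = np.array([[1.00, 0.90, 0.70],
                [0.90, 1.00, 0.95],
                [0.70, 0.95, 1.00]])

cov = nearest_psd(raw)
chol = np.linalg.cholesky(cov)           # guaranteed to exist after repair

# Correlated scenarios whose sample covariance replicates `cov`.
rng = np.random.default_rng(0)
z = rng.standard_normal((100_000, 3))
scenarios = z @ chol.T
</```>
```

Note that the projection slightly perturbs the diagonal, so in practice one might rescale the result back to unit variances before use.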
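To see why purely Gaussian dependence can understate joint credit events, one can compare empirical joint-tail frequencies under a Gaussian and a Student-t copula sharing the same correlation; the t copula exhibits lower-tail dependence, so joint extreme moves are far more frequent. The correlation, degrees of freedom, and tail quantile below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n, rho, df = 500_000, 0.5, 3
chol = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

# Gaussian pairs with correlation rho.
z = rng.standard_normal((n, 2)) @ chol.T
# Student-t pairs: divide each Gaussian pair by a shared chi-square mixing
# variable, the standard construction of a bivariate t (hence t copula).
w = rng.chisquare(df, size=(n, 1)) / df
t = z / np.sqrt(w)

def joint_tail_freq(x, q=0.01):
    """Frequency with which BOTH components fall below their q-quantile."""
    lo = np.quantile(x, q, axis=0)
    return np.mean((x[:, 0] < lo[0]) & (x[:, 1] < lo[1]))

gauss_freq = joint_tail_freq(z)
t_freq = joint_tail_freq(t)
# t_freq substantially exceeds gauss_freq: the t copula concentrates mass
# in the joint lower tail, mirroring the crisis behavior described above.
```

Because the comparison is made on quantiles, it depends only on the copula, not on the marginal distributions.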

It is the authors' hope that various elements of this proposed methodology will influence future debates on "best practice" techniques to address computational requirements arising from Basel II.