Dynamic Sensitivities and Initial Margin via Chebyshev Tensors
This paper shows how to use Chebyshev Tensors to compute dynamic sensitivities of financial instruments within a Monte Carlo simulation. Dynamic sensitivities are then used to compute Dynamic Initial Margin as defined by ISDA SIMM. The technique is benchmarked against dynamic sensitivities obtained with pricing functions like the ones found in risk engines. We obtain high accuracy and substantial computational gains for FX swaps and Spread Options.
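As a minimal illustration of the idea (a generic sketch, not the paper's actual engine), the code below builds a one-dimensional Chebyshev proxy of a Black-Scholes call price and differentiates it to recover the delta. In the paper's setting such proxies stand in for the full pricer at every Monte Carlo node; all parameter values here are assumptions for the example.

```python
# Sketch: 1-D Chebyshev proxy of a Black-Scholes call, differentiated
# to recover delta (a "dynamic sensitivity") without calling the pricer.
import numpy as np
from numpy.polynomial import chebyshev as C
from math import erf, sqrt, log, exp

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K=100.0, T=1.0, r=0.01, sigma=0.2):
    # Stand-in for an expensive Front Office pricing function.
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

lo, hi = 60.0, 160.0                       # spot domain of the proxy
nodes = C.chebpts2(21)                     # Chebyshev points on [-1, 1]
spots = 0.5 * (hi - lo) * (nodes + 1) + lo # mapped to [lo, hi]
coeffs = C.chebfit(nodes, [bs_call(s) for s in spots], 20)
dcoeffs = C.chebder(coeffs) * 2.0 / (hi - lo)   # chain rule for the rescaling

def proxy_price(S):
    x = 2.0 * (S - lo) / (hi - lo) - 1.0
    return C.chebval(x, coeffs)

def proxy_delta(S):
    x = 2.0 * (S - lo) / (hi - lo) - 1.0
    return C.chebval(x, dcoeffs)
```

In higher dimensions the same construction becomes a tensor of Chebyshev coefficients over several risk factors, which is what keeps the per-scenario cost low inside the simulation.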
Denting the FRTB IMA computational challenge via Orthogonal Chebyshev Sliding Technique
In this paper we introduce a new technique, based on high-dimensional Chebyshev Tensors, that we call the Orthogonal Chebyshev Sliding Technique. We implemented this technique inside the systems of a tier-one bank, and used it to approximate Front Office pricing functions in order to reduce the substantial computational burden associated with the capital calculation specified by FRTB IMA. In all cases, the computational burden was reduced by more than 90%, while a high degree of accuracy was retained thanks to the mathematical properties enjoyed by Chebyshev Tensors.
Boosted Interpolation Grids
The power of Deferred Object Loading
A central challenge that FRTB brings to banks is how to utilise existing infrastructure in the context of the P&L Attribution Test (PLAT); in particular, how to improve interpolation grids for fast pricing. The new MoCaX functionality, Deferred Object Loading, provides an ideal solution. MoCaX technology delivers an interpolation scheme that is guaranteed to converge ultra-fast to the original pricing function. This means that a MoCaX “smart” grid needs very few points to give ultra-high pricing accuracy. With the newly added functionality, MoCaX can create these smart grids without any specific information about the particular trade at stake, in the same way that banks currently create standard interpolation grids.
This solution is ideal for existing interpolation grid infrastructures because all the inter-systems communications operate in exactly the same way and, in addition, it also delivers ultra-accurate pricing. Hence, PLAT can be passed (highly accurate pricing) and infrastructure costs decrease (fewer calls to the pricer are needed).
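The convergence claim can be illustrated with a toy comparison (a generic sketch, not the MoCaX API): a 21-point Chebyshev grid versus a 21-point equispaced linear-interpolation grid, both approximating the same smooth, pricer-like function.

```python
# Sketch: worst-case error of a standard 21-point linear grid versus a
# 21-point Chebyshev "smart" grid on a smooth stand-in pricing function.
import numpy as np
from numpy.polynomial import chebyshev as C

f = lambda x: np.exp(-x) * np.sin(5.0 * x)   # smooth stand-in pricer
xs = np.linspace(0.0, 1.0, 2001)             # dense test points on [0, 1]

# Standard grid: 21 equispaced nodes with piecewise-linear interpolation.
grid = np.linspace(0.0, 1.0, 21)
lin_err = np.max(np.abs(np.interp(xs, grid, f(grid)) - f(xs)))

# Smart grid: 21 Chebyshev nodes mapped to [0, 1], degree-20 interpolant.
nodes = 0.5 * (C.chebpts2(21) + 1.0)
coeffs = C.chebfit(2.0 * nodes - 1.0, f(nodes), 20)
cheb_err = np.max(np.abs(C.chebval(2.0 * xs - 1.0, coeffs) - f(xs)))
```

With the same number of pricer calls, the Chebyshev grid's worst-case error is orders of magnitude smaller; equivalently, far fewer nodes are needed for a given pricing accuracy.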
Consultative Document. Reducing variation in credit risk-weighted assets – constraints on the use of internal model approaches.
Comments for the Basel Committee on Banking Supervision
The Basel consultation proposes to remove risk sensitive models, or put floors to them, for Counterparty Credit Risk capital calculations of OTC derivatives.
I believe that, while the Committee’s objectives are understandable, the proposed changes would have very damaging consequences indeed. For that reason I submitted my view to the Committee as an independent expert in the field.
In this paper, I explain that, should the new proposals be implemented,
- Systemic risk is created: when simple models fail, they will fail in all banks in the world at the same time. This is precisely one of the key risks the regulatory landscape is mandated to remove from the trading system.
- Market dislocations are created, because trading activity will be steered to those areas with low capital requirements but high relative risk, as this is where return on capital will be maximised.
- The distribution of risk is inhibited, as a consequence of excessive flat capital rules, not based on economics, replacing advanced, economically based modelled capital. This would clearly have an (unnecessary) negative economic impact.
- Banks are disincentivised from investing in sound risk management practices, as the proposals send the message to the industry that a bank does not need to be concerned about adopting its own internal risk controls.
- Innovation is inhibited. Innovation is at the core of any healthy industry, economy and society. Why would a bank invest in research and development of better risk management processes, methodologies and systems when the potential benefits that might accrue have been removed by regulation?
No doubt there is some benefit in simplifying rules, both for regulators and for some players in the industry, but those benefits need to be balanced against the reduction in the efficacy of those rules. One of the fundamental goals of those rules is to provide a stable financial system on which the real economy can be based and grow. My intention with this document is to bring to the Committee’s attention the fact that the new proposals materially reduce the efficacy of the capital framework; i.e., they increase the instability of the financial system and constrain real economic growth.
A Complete XVA Valuation Framework
Why the “Law of One Price” is dead
Pricing a book of derivatives has become quite a complicated task, even when those derivatives are simple in nature. This is the effect of the new trading environment, highly dominated by credit, funding and capital costs. In this paper the author formally sets up a global valuation framework that accounts for market risk (risk neutral price), credit risk (bilateral CVA), funding risk (FVA) of self-default potential hedging (LVA), collateral (CollVA) and market hedging positions (HVA), as well as tail risk (KVA). These pricing metrics create a framework in which we can comprehensively value trading activity. An immediate consequence of this is the emergence of a potential difference between fair value accounting and internal accounting. This piece of work also explains the difference between the two, and how to perform calculations in both worlds in a realistic and coherent manner, demonstrating via arbitrage-impossibility arguments that an XVA framework should be used in both cases.
Interview with Ignacio Ruiz, conducted by PRMIA, on the subject of FVA
In this interview Ignacio explains the reasons behind FVA:
- What is FVA?
- When and how it should (and should not) be used
- Why is there so much controversy around it?
Optimal Right and Wrong Way Risk
This paper provides a comprehensive study of Right and Wrong Way Risk. The authors…
- First explain the underlying source of this risk and how it applies to CVA as well as other credit metrics, together with a review of the available methodologies.
- They then provide a critique of the different models and give their view as to which is the optimal framework, and why. This is done from the standpoint of a practitioner, with special consideration of practical implementation and utilisation issues.
- After that, they extend the current state-of-the-art research in the chosen methodology with a comprehensive empirical analysis of the market-credit dependency structure. They utilise 150 case studies, providing evidence of the real market-credit dependency structure and giving calibrated model parameters as of January 2013.
- Next, using these realistic calibrations, they carry out an impact study of right-way and wrong-way risk in real trades, in all relevant asset classes (equity, FX and commodities) and trade types (swaps, options and futures). This is accomplished by calculating the change in all major credit risk metrics that banks use (CVA, initial margin, exposure measurement and capital) when this risk is taken into account.
- All this is done both for collateralised and uncollateralised trades.
- Finally, based on this impact study, the authors explain why a good right and wrong way risk model (as opposed to “any” model that gives a result) is central to financial institutions, and describe the consequences of not having one.
The results show how these credit metrics can vary quite significantly, both in the “right” and the “wrong” ways. This analysis also illustrates the effect of collateral; for example, how a trade can have wrong-way risk when uncollateralised, but right-way risk when collateralised.
FVA Calculation and Management
CVA, DVA, FVA and their interaction (Part II)
This is the second part of a dual paper on FVA. It explains how to calculate FVA and how to manage funding risk.
The calculation and management of funding risk for a portfolio of OTC derivatives is anything but trivial. In the first part of this paper (FVA Demystified), we discussed the ideas underlying FVA. We saw that it is an adjustment made to the price of a portfolio of derivatives to account for the future funding cost an institution might face. We also saw that it is very important to differentiate between the Price of a derivative (the amount of money we would get if we sold the derivative) and the Value to Me (the Price minus my cost of manufacturing the derivative). In this paper, we investigate the practicalities of FVA: how it can be calculated and managed. If we have a proper CVA system, calculating FVA is not too difficult, subject to a few reasonable assumptions, because a good CVA Monte Carlo simulation already calculates many of the inputs needed to compute FVA. We also explain the role of FVA desks in current large organisations, as well as how FVA can be risk-managed and hedged. Finally, we propose a management set-up for CVA and FVA, and explain why a number of institutions have decided to join both desks.
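As a rough illustration of the kind of calculation described, FVA can be assembled from the expected exposure profiles a CVA Monte Carlo engine already produces. The profile and spread numbers below are made-up placeholders, not figures from the paper.

```python
# Sketch: FVA from expected exposure profiles, discretised as
#   FCA ~ sum_t EPE(t) * borrow_spread * dt * DF(t)   (a cost, positive)
#   FBA ~ sum_t ENE(t) * lend_spread  * dt * DF(t)    (a benefit, negative)
import numpy as np

times = np.arange(0.25, 5.01, 0.25)          # quarterly grid, in years
dt = 0.25
epe = 1e6 * np.exp(-0.3 * times)             # expected positive exposure (illustrative)
ene = -0.6e6 * np.exp(-0.3 * times)          # expected negative exposure (illustrative)
spread_borrow, spread_lend = 0.012, 0.002    # annualised funding spreads (assumed)
df = np.exp(-0.02 * times)                   # risk-free discount factors (assumed flat 2%)

fca = np.sum(epe * spread_borrow * dt * df)  # funding cost adjustment
fba = np.sum(ene * spread_lend * dt * df)    # funding benefit adjustment
fva = fca + fba
```

The point is that EPE, ENE and discount factors are by-products of a good CVA engine, so the marginal cost of adding FVA on top is small.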
CVA, DVA, FVA and their interaction (Part I)
This is the first part of a dual paper on FVA. This one offers a review of the concept behind FVA, and how it interacts with CVA and DVA.
There is currently a rather heated discussion about the role of the Funding Value Adjustment (FVA) when pricing OTC derivative contracts. On one hand, theorists claim that it should not be accounted for, as doing so leads to arbitrage opportunities. On the other hand, practitioners say they need to account for it, as otherwise their cost base is not reflected in the price of a contract. On the surface these claims seem contradictory, but upon closer examination they are not. In this paper, we define FVA, explain its role, how it interacts with CVA, and how it should (and should not) be used in an organisation. We shall see, however surprising it may sound, that this debate can be seen as a semantic misunderstanding: the two sides of the argument are using the same word ‘price’ for two very different things, ‘fair price’ and ‘value to me’. Noting this, the paradox disappears, and we may properly understand the role of CVA, DVA and FVA in an organisation.
Funding Value Adjustment… to be or not to be
Presentation at the Oxford-Man Institute, University of Oxford, discussing the pros and cons of the widely debated FVA:
- FVA v. CVA
- Context in which FVA makes sense
- Why and when FVA may (or may not) be needed
- The practitioner v. the academic view
Backtesting Counterparty Risk: How Good is your Model?
Forthcoming publication in the Journal of Credit Risk
Backtesting Counterparty Credit Risk models is anything but simple. Such backtesting is becoming increasingly important in the financial industry, since both the CCR capital charge and CVA management have become ever more central to banks. In spite of this, there are no clear guidelines from regulators as to how to perform this backtesting. This is in contrast to Market Risk models, for which the Basel Committee set strict rules in 1996 that are widely followed. In this paper, the author explains a quantitative methodology to backtest counterparty risk models. He expands the three-colour Basel Committee scoring scheme from the Market Risk framework to the Counterparty Credit Risk framework. With this methodology, each model can be assigned a colour score for each chosen time horizon. Financial institutions can then use this framework to assess the need for model enhancements and to manage model risk. The author has implemented this framework in a Tier-1 financial institution; the model report it generated was sent to the regulators for IMM model approval. The model was approved a few months later.
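For context, the Basel Market Risk scheme that the paper extends can be sketched as follows. The zone boundaries (95% and 99.99% cumulative binomial probability) follow the 1996 Basel traffic-light framework for 99% VaR over 250 days; the extension to CCR horizons is the paper's contribution and is not reproduced here.

```python
# Sketch: Basel three-colour (traffic-light) score from the number of
# 99% VaR exceptions observed over 250 trading days.
from math import comb

def binom_cdf(k, n, p):
    # P(X <= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def traffic_light(exceptions, n=250, p=0.01):
    cum = binom_cdf(exceptions, n, p)
    if cum < 0.95:        # 0-4 exceptions: model consistent with 99% VaR
        return "green"
    elif cum < 0.9999:    # 5-9 exceptions: inconclusive zone
        return "yellow"
    return "red"          # 10+ exceptions: model very likely deficient
```

Under the paper's methodology a score of this kind is produced per model and per time horizon, rather than only for the one-day Market Risk horizon.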
Modelling Credit Spreads for Counterparty Risk: Mean-Reversion is not Needed
Published in Intelligent Risk, Oct’12
When modelling credit spreads, there is some controversy in the market as to whether they are mean-reverting or not. This is particularly important in the context of counterparty risk, at least for risk management and capital calculations, as those models need to backtest correctly and, hence, they need to follow the “real” measure, as opposed to the “risk-neutral” one. This paper shows evidence that the credit spreads of individual corporate names, by themselves, are not mean-reverting. Our results also suggest that a mean-reversion feature should be implemented in the context of joint spread-default modelling, but not in a spread-only model.
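A common way to probe mean reversion, in the spirit of (though not necessarily identical to) the paper's analysis, is to regress daily changes of the log-spread on its lagged level: a significantly negative slope indicates mean reversion, a slope near zero indicates a random walk. The sketch below runs this check on simulated data, not on the market data used in the paper.

```python
# Sketch: OU/AR(1)-style mean-reversion check via the regression
#   ds_t = a + b * s_{t-1} + noise;  b << 0 implies mean reversion.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
s = np.cumsum(rng.normal(0.0, 0.02, n))    # random-walk log-spread: no mean reversion

ds = np.diff(s)                            # daily changes
lag = s[:-1]                               # lagged levels
slope, intercept = np.polyfit(lag, ds, 1)  # OLS fit of ds on lagged s
half_life = np.log(2) / -slope if slope < 0 else np.inf
```

For the simulated random walk the fitted slope comes out close to zero, which is the signature the paper reports for individual corporate credit spreads in a spread-only model.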
Advanced Counterparty Risk and CVA via Stochastic Volatility
Forthcoming publication in the Journal of Financial Transformation
Exposure models in the context of counterparty risk have become central to financial institutions. They are a main driver of CVA pricing, capital calculation and risk management. It is general practice in the industry to use constant-volatility normal or log-normal models for this purpose. Ignacio Ruiz and Ricardo Pachón explain some of the strong limitations of those models and show how stochastic volatility can improve the situation substantially. This is shown with illustrative examples that tackle day-to-day problems that practitioners face. Using a Black-Karasinski model for the volatility coupled with a GBM model for the spot as an example, it is shown how stochastic volatility models can provide tangible benefits by improving netting effects, CVA pricing accuracy, regulatory capital calculation, initial margin calculations and the quality of exposure management.
Technical Note: On Wrong Way Risk
Forthcoming publication in the Journal of Financial Transformation
Wrong Way Risk can be of crucial importance when computing counterparty risk measurements like EPE or PFE profiles. It appears when the default probability of a given counterparty is not independent of its portfolio value. There are a number of approaches in the literature but, to the author’s knowledge, they all fail to provide either a computationally efficient approach or an intuitive methodology, or both. This technical note tackles this problem and describes an intuitive and fairly easy method to account for Wrong Way Risk with minimal added computational effort.
Credit Value Adjustment (CVA) has been one of the “hot topics” in the financial industry since 2009. There have been several papers on the subject, and the topic has been widely discussed in banking conferences around the world. However, the fundamentals of this topic have often been misunderstood or misinterpreted. Arguably, this is the result of a typical problem among quants (I am a quant): we have a tendency to explain things in unnecessarily complicated ways. This paper aims to bridge that gap. It explains, using simple examples, how CVA has been priced into banking products for centuries and how it has been, in fact, at the root of the banking industry since its origins. It shows how CVA is nothing more than the application to modern financial derivatives of very simple and old concepts, and how its misinterpretation can lead to accounting mismanagement. The mathematical formulation is not included in this text, as it can be found in several publications.
CVA: Default Probability ain’t matter?
CVA can be priced using market implied ‘risk-neutral’ or historical ‘real-world’ parameters. There is no consensus in the market as to which approach to use. This paper illustrates in simple terms the fundamental differences between the two approaches and the consequences of using each of them.