### Dynamic Initial Margin via Chebyshev Spectral Decomposition

We present an accurate and computationally efficient method, based on Chebyshev Spectral Decomposition, to compute the Initial Margin of financial products stochastically within a Monte Carlo simulation by simulating sensitivities. The methodology is compared, in terms of accuracy, efficiency, and implementation/maintenance costs, with common techniques used for the same purpose, such as amortisation-based and regression-based methods and Adjoint Algorithmic Differentiation. Measured against these criteria, the methodologies based on Chebyshev interpolants offer an optimal solution and set a new benchmark for the industry.

### Webinar: Chebyshev Spectral Decomposition for Ultra-efficient Risk Calculations

- The power of Chebyshev – MoCaX as a Smart interpolation scheme
- Selection of interpolating points and functions
- Chebyshev nodes and Chebyshev polynomials in the context of Risk Calculations
- Theoretical basis: three fundamental theorems
- Example: Parametric Chebyshev interpolation for Risk Calculations
- Practical case studies:
  - CVA
  - CVA on exotics
  - Cost-effective IMA-FRTB (passing P&L Attribution tests!)
  - Dynamic Initial Margin
  - Accurate MVA
  - Ultra-fast XVA sensitivities
- Commercial benefits: reduction of hardware costs, effective computation of risk metrics, hedging regulatory risk
- Generic AAD via Chebyshev Decomposition

### Chebyshev Methods for Ultra-efficient Risk Calculations (Paper)

#### Spectral Decomposition Applications in Risk Management

Financial institutions now face the significant challenge of having to perform multiple portfolio revaluations for their risk computations. The list of use cases is almost endless: XVAs, FRTB, stress-testing programmes, and more. These computations require from several hundred up to a few million revaluations, and the cost of implementing them via “brute-force” full revaluation is enormous. There is now strong demand in the industry for algorithmic solutions to this challenge.

In this paper we present a solution based on Chebyshev interpolation techniques. It rests on the demonstrated fact that these interpolants converge exponentially for the vast majority of pricing functions an institution uses. We elaborate on the theory behind this and extend the techniques to any dimensionality. We then approach the problem from a practical standpoint, illustrating how it can be applied to many of the challenges the industry is currently facing. We show that the computational effort of many current risk calculations can be decreased by orders of magnitude with the proposed techniques, without compromising accuracy.

Illustrative examples include XVAs and IMM on exotics, XVA sensitivities, Initial Margin Simulations, IMA-FRTB and AAD.
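The exponential-convergence claim can be illustrated with a short sketch. The pricing function below is a hypothetical smooth stand-in, not one of the paper's examples: interpolating it at Chebyshev nodes, the maximum error collapses as the number of points grows.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical smooth "pricing function" stand-in, for illustration only.
f = lambda x: np.exp(-0.5 * x) * np.sin(3.0 * x)

x_test = np.linspace(-1.0, 1.0, 2001)
exact = f(x_test)

errors = {}
for n in (4, 8, 16, 32):
    nodes = np.cos(np.pi * np.arange(n + 1) / n)   # Chebyshev points on [-1, 1]
    coeffs = C.chebfit(nodes, f(nodes), n)         # degree-n interpolant
    errors[n] = float(np.max(np.abs(C.chebval(x_test, coeffs) - exact)))
    print(f"n = {n:2d}   max error = {errors[n]:.2e}")
```

For smooth (analytic) functions the error decays geometrically in the number of nodes, which is why very few evaluations of the original pricer suffice for pricing-grade accuracy.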

### Chebyshev Decomposition for Ultra-efficient Risk Calculations (Presentation)

#### The Underlying Methods inside MoCaX Intelligence

This PDF contains the slides of the presentation given at the 13th Fixed Income Conference (Florence, 2017), in which the methodology underlying MoCaX Intelligence was first made public to a wide audience.

Chebyshev decomposition methods have been around for years. In this talk, I. Ruiz explained how they can be used to increase the efficiency of Risk Calculations by orders of magnitude. Examples shown include:

- CVA, FVA and IMM capital
- Dynamic Initial Margin simulation
- IMA-FRTB
- Inter-systems Pricer “Cloning”
- Portfolio Pricing Compression
- AAD

It is shown how Chebyshev Decompositions can decrease the computational effort by orders of magnitude with no loss of accuracy relative to full revaluation.

### Boosted Interpolation Grids

#### The power of Deferred Object Loading

A central challenge that FRTB brings to banks is how to utilise existing infrastructure in the context of the P&L Attribution Test (PLAT); in particular, how to improve interpolation grids for fast pricing. The new MoCaX functionality, Deferred Object Loading, provides an ideal solution. MoCaX technology delivers an interpolation scheme that is guaranteed to converge ultra-fast to the original pricing function. This means that a MoCaX “smart” grid needs very few points to achieve ultra-high pricing accuracy. With the newly added functionality, MoCaX can create these smart grids without any specific information about the particular trade at stake, in the same way that banks currently create standard interpolation grids.

This solution is ideal for existing interpolation-grid infrastructures because all the inter-systems communications operate in exactly the same way while also delivering ultra-accurate pricing. Hence, PLAT can be passed (highly accurate pricing) and infrastructure costs decrease (fewer calls to the pricer are needed).
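The "smart grid" point can be sketched by comparing, at the same number of grid points, a standard equally-spaced linear-interpolation grid with a polynomial interpolant on Chebyshev nodes. The pricing function and node count below are hypothetical stand-ins, not MoCaX itself:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical smooth pricing function, used only for illustration.
f = lambda x: np.exp(-0.5 * x**2) * np.cos(4.0 * x)

x_test = np.linspace(-1.0, 1.0, 5001)
exact = f(x_test)
n = 12  # both schemes get 13 grid points

# Standard grid: equally spaced nodes + piecewise-linear interpolation
uniform = np.linspace(-1.0, 1.0, n + 1)
err_linear = float(np.max(np.abs(np.interp(x_test, uniform, f(uniform)) - exact)))

# "Smart" grid: Chebyshev nodes + polynomial interpolant
nodes = np.cos(np.pi * np.arange(n + 1) / n)
coeffs = C.chebfit(nodes, f(nodes), n)
err_cheb = float(np.max(np.abs(C.chebval(x_test, coeffs) - exact)))

print(f"linear grid error:    {err_linear:.2e}")
print(f"Chebyshev grid error: {err_cheb:.2e}")
```

With the same number of calls to the pricer, the Chebyshev grid is orders of magnitude more accurate, which is the property that lets such a grid pass accuracy-sensitive tests like PLAT.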

### IMA-FRTB – The Computational Challenge

#### A Benchmark Study of Fast-Pricing Solutions

For banks applying for the Internal Model Approach under the coming FRTB rules, P&L attribution tests (PLATs) are proving extremely challenging. Among those who have managed to pass these tests in simulated set-ups, there is common agreement that full revaluation is the best way to succeed. However, running IMA-FRTB via the Front Office pricers without any tailoring imposes enormous hardware costs and IT challenges. The only practical solution seems to lie in numerical techniques that provide quasi-full revaluation: very accurate and fast proxy-pricing.

We ran a case study with 60 trades (IR swaps, IR Bermudan swaptions and exotic barrier options) with 5 underlying NMRFs to compute the IMCC charges, VaR and P&L attribution tests. We compared different numerical pricing techniques: Taylor series expansions, linear interpolation grids and “Smart” Grids via MoCaX Objects. This simple portfolio required 0.7 million revaluations for a full IMA-FRTB computation. All pricing techniques were much more computationally efficient than “brute-force” full revaluation (around 50 times faster), but the only technique able to pass the difficult PLATs was “Smart” Grids via MoCaX Objects. This is because it is the only one of the tested pricing alternatives that is both ultra-fast and ultra-accurate.

In this paper we explain the details of this comparative study.

### Consultative Document: Reducing variation in credit risk-weighted assets – constraints on the use of internal model approaches

#### Comments for the Basel Committee on Banking Supervision

The Basel consultation proposes to remove risk-sensitive models, or to put floors on them, for Counterparty Credit Risk capital calculations of OTC derivatives.

I believe that, while the Committee’s objectives are understandable, the changes proposed would have very damaging consequences indeed. For that reason I submitted my view to the committee as an independent expert in the field.

In this paper, I explain that, should the new proposals be implemented,

- Systemic risk is created: when simple models fail, they will fail in all banks in the world at the same time. This is precisely one of the key risks the regulatory landscape is mandated to remove from the trading system.
- Market dislocations are created, because trading activity will be steered to those areas with low capital requirements but high relative risk as this is where return on capital will be maximised.
- The distribution of risk is inhibited, as flat, non-economically-based capital rules impose excessive charges compared with advanced, economically-based modelled capital. This would clearly have an (unnecessary) negative economic impact.
- Banks are disincentivised from investing in sound risk management practices, as the message sent to the industry is that a bank need not be concerned with adopting its own internal risk controls.
- Innovation is inhibited. Innovation is at the core of any healthy industry, economy and society. Why would a bank invest in research and development of better risk management processes, methodologies and systems when the potential benefits that might accrue have been removed by regulation?

No doubt there is some benefit in simplifying rules, both for the regulator and for some players in the industry, but those benefits need to be balanced against the reduction in the efficacy of those rules. One of the fundamental goals of those rules is to provide a stable financial system on which the real economy can be based and grow. My intention with this document is to bring to the Committee’s attention the fact that the new proposals materially reduce the efficacy of the capital framework; i.e., they increase the instability of the financial system and constrain real economic growth.

### Webinar: The cost of trading under Initial Margin

#### Dynamic IM simulation for MVA

Since September 2016 the financial industry has been facing new regulation that is going to shape (again) the business of OTC derivatives: all tier-1 derivative dealers have to post Initial Margin on their books of bilateral trades. From September 2020 nearly all financial institutions will have to comply. The Fed and ISDA have estimated the cost of this new framework in the many billions.

In this webinar Ignacio Ruiz introduces the details of the new regulatory framework and presents computations of all XVAs (including MVA, the funding cost of Initial Margin) under different trading conditions for an illustrative interest rate swap and a swaption. He shows how, under the new regulation, the cost of trading is going back to the old uncollateralised levels. He also explains how MVA can have strong Wrong Way Risk and how not only will the actual value of MVA be very high, but its volatility will also be very strong in stressed markets. Finally, he explains how MVA can be calculated fast and accurately with the novel AGA (Algorithmic Greeks Acceleration) method.

Topics covered will include:

- The new economics of trading under Initial Margin
- Impact in the industry and in the market
- MVA vs. CVA, DVA, FVA, KVA
- The operational and regulatory path
- Dynamic SIMM simulation: how to do it fast and accurately
- Example calculations: swaps and swaptions

### The New Economics of OTC Derivatives: MVA vs. CVA, FVA & KVA

#### The impact of Initial Margin – Part 1

Since September 2016 the financial industry has been facing new regulation that is going to shape (again) the business of OTC derivatives: all tier-1 derivative dealers have to post Initial Margin on their books of bilateral trades. From September 2020 nearly all financial institutions will have to comply. The Fed and ISDA have estimated the cost of this new framework in the many billions.

In the first part of this paper we introduce the details of the new regulatory framework and present computations of all XVAs (including MVA) under different trading conditions for an illustrative interest rate swap. We see how, under the new regulation, the cost of trading is going back to the old uncollateralised levels. We also show that banks now have a clear economic incentive to clear trades via CCPs, but that there will still be a strong market for bilateral trading. We see how MVA can have strong Wrong Way Risk and how not only will the actual value of MVA be very high, but its volatility will also be very strong in stressed markets, as a result of the leverage mechanism that the regulatory €50m threshold produces. Therefore, simulating the Initial Margin accurately within the XVA Monte Carlo engine is central to correct pricing and hedging.
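The leverage mechanism produced by the regulatory €50m threshold can be sketched in a few lines (the IM numbers are illustrative): a modest relative move in IM around the threshold becomes a much larger relative move in the collateral actually posted, and hence in funding needs.

```python
import numpy as np

# Collateral actually posted under a regulatory threshold:
# posted = max(IM - 50m, 0).  Illustrative scenario numbers only.
threshold = 50e6
im = np.array([48e6, 52e6, 60e6])          # IM across three market scenarios
posted = np.maximum(im - threshold, 0.0)   # amount that must be funded
print(posted / 1e6)                        # in millions
```

Moving from the second to the third scenario, IM rises by about 15% but the posted amount (and so the funding cost) rises fivefold, which is the volatility-amplifying effect described above.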

### Dynamic SIMM via Algorithmic Greeks Acceleration (AGA)

#### The impact of Initial Margin – Part 2

Since September 2016 the financial industry has been facing new regulation that is going to shape (again) the business of OTC derivatives: all tier-1 derivative dealers have to post Initial Margin on their books of bilateral trades. From September 2020 nearly all financial institutions will have to comply. The Fed and ISDA have estimated the cost of this new framework in the many billions.

In the second part of this paper we present a method that enables accurate and fast computation of MVA via a dynamic SIMM simulation inside the XVA Monte Carlo engine. We show how important a good SIMM simulation is by examining its effect on an illustrative trade with strong Initial Margin funding Wrong Way Risk effects (a swaption). To our knowledge this is the only general method that can simulate SIMM dynamically (and hence calculate MVA) in an accurate and timely manner.
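The flavour of a SIMM-style margin computed from simulated sensitivities can be sketched as follows. The risk weights and correlations below are toy numbers, not the ISDA SIMM parameters, and the single evaluation stands in for one (path, time) node of the Monte Carlo engine:

```python
import numpy as np

def delta_margin(sensitivities, risk_weights, corr):
    """Toy SIMM-style delta margin: weight the sensitivities, then
    aggregate with a correlation matrix, IM = sqrt(ws' * C * ws).
    Illustrative only -- not the ISDA SIMM weights or correlations."""
    ws = risk_weights * sensitivities
    return float(np.sqrt(ws @ corr @ ws))

# Hypothetical 3-tenor IR delta profile of a trade on one MC path
sens = np.array([12_000.0, -45_000.0, 30_000.0])   # per-bp sensitivities
rw = np.array([50.0, 70.0, 60.0])                  # toy risk weights
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.6],
                 [0.3, 0.6, 1.0]])                 # toy tenor correlations

# A dynamic SIMM simulation evaluates this at every path and time step;
# here one evaluation stands in for a single (path, t) node.
print(f"IM on this node: {delta_margin(sens, rw, corr):,.0f}")
```

The cost driver is clear from the sketch: each node needs a full vector of sensitivities, which is why fast sensitivity computation (the AGA method in the paper) is the enabling ingredient.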

### A Complete XVA Valuation Framework

#### Why the “Law of One Price” is dead

Pricing a book of derivatives has become quite a complicated task, even when those derivatives are simple in nature. This is the effect of the new trading environment, highly dominated by credit, funding and capital costs. In this paper the author formally sets up a global valuation framework that accounts for market risk (risk-neutral price), credit risk (bilateral CVA), funding risk (FVA) arising from self-default potential hedging (LVA), collateral (CollVA) and market hedging positions (HVA), as well as tail risk (KVA). These pricing metrics create a framework in which trading activity can be valued comprehensively. An immediate consequence is the emergence of a potential difference between fair-value accounting and internal accounting. This piece of work also explains the difference between the two, and how to perform calculations in both worlds in a realistic and coherent manner, demonstrating via arbitrage-impossibility arguments that an XVA framework should be used in both cases.

### FVA Explained

Interview with Ignacio Ruiz, conducted by PRMIA, on the subject of FVA

In this interview Ignacio explains the reasons behind FVA:

- What is FVA?
- When and how it should and shouldn’t be used
- Why is there so much controversy around it?

### Optimal Right and Wrong Way Risk

This paper provides a comprehensive study on Right and Wrong Way Risk. In this paper, the authors…

- First explain the underlying source of this risk and how it applies to CVA as well as other credit metrics, together with a review of the available methodologies.
- Further to it, they provide a critique of the different models and their view as to which is the optimal framework, and why. This is done from the standpoint of a practitioner, with special consideration of practical implementation and utilisation issues.
- After that, they extend the current state-of-the-art research in the chosen methodology with a comprehensive empirical analysis of the market-credit dependency structure. They utilise 150 case studies, providing evidence of what is the real market-credit dependency structure, and giving calibrated model parameters as of January 2013.
- Next, using these realistic calibrations, they carry out an impact study of right-way and wrong-way risk in real trades, in all relevant asset classes (equity, FX and commodities) and trade types (swaps, options and futures). This is accomplished by calculating the change in all major credit risk metrics that banks use (CVA, initial margin, exposure measurement and capital) when this risk is taken into account.
- All this is done both for collateralised and uncollateralised trades.
- Finally, based on this impact study, the authors explain why a good right and wrong way risk model (as opposed to “any” model that gives a result) is central to financial institutions, and describe the consequences of not having one.

The results show how these credit metrics can vary quite significantly, both in the “right” and the “wrong” ways. This analysis also illustrates the effect of collateral; for example, how a trade can have wrong-way risk when uncollateralised, but right-way risk when collateralised.

### FVA Calculation and Management

#### CVA, DVA, FVA and their interaction (Part II)

This is the second part of a dual paper on FVA. It explains how to calculate FVA and how to manage funding risk.

The calculation and management of funding risk for a portfolio of OTC derivatives is anything but trivial. In the first part of this paper (FVA Demystified), we discussed the ideas underlying FVA. We saw that it is an adjustment made to the price of a portfolio of derivatives to account for the future funding cost an institution might face. We also saw that it is very important to differentiate between the Price of a derivative (the amount of money we would get if we sold the derivative) and the Value to Me (the Price minus my cost of manufacturing the derivative). In this paper, we investigate the practicalities of FVA: how it can be calculated and managed. Given a proper CVA system, calculating FVA is not too difficult, subject to a few reasonable assumptions, because a good CVA Monte Carlo simulation already calculates many of the inputs needed to compute FVA. We also explain the role of FVA desks in today's large organisations, as well as how FVA can be risk-managed and hedged. Finally, we propose a management set-up for CVA and FVA, and explain why a number of institutions have decided to merge the two desks.

### FVA Demystified

#### CVA, DVA, FVA and their interaction (Part I)

This is the first part of a dual paper on FVA. This one offers a review of the concept behind FVA, and how it interacts with CVA and DVA.

There is currently a rather heated discussion about the role of Funding Value Adjustment (FVA) when pricing OTC derivative contracts. On one hand, theorists claim that it should not be accounted for, as it leads to arbitrage opportunities. On the other hand, practitioners say they need to account for it, as otherwise their cost base is not reflected in the price of a contract. On the surface these claims seem contradictory, but upon closer examination they are not. In this paper, we define FVA, explain its role, how it interacts with CVA, and how it should (and should not) be used in an organisation. We shall see, however surprising it may sound, that this debate can be viewed as a semantic misunderstanding: the two sides of the argument are using the same word ‘price’ for two very different things, ‘fair price’ and ‘value to me’. Once this is noted, the paradox disappears, and we may properly understand the role of CVA, DVA and FVA in an organisation.

### Funding Value Adjustment… to be or not to be

Presentation at the Oxford-Man Institute, University of Oxford, discussing the pros and cons of the widely debated FVA.

- FVA v. CVA
- Context in which FVA makes sense
- Why and when FVA may (or may not) be needed
- The practitioner v. the academic view

### Backtesting Counterparty Risk: How Good is your Model?

Forthcoming publication in the Journal of Credit Risk

Backtesting Counterparty Credit Risk models is anything but simple. Such backtesting is becoming increasingly important in the financial industry as both the CCR capital charge and CVA management become ever more central to banks. In spite of this, there are no clear guidelines from regulators as to how to perform this backtesting. This is in contrast to Market Risk models, for which the Basel Committee established a strict set of rules in 1996 that are widely followed. In this paper, the author explains a quantitative methodology to backtest counterparty risk models. He extends the three-colour Basel Committee scoring scheme from the Market Risk to the Counterparty Credit Risk framework. With this methodology, each model can be assigned a colour score for each chosen time horizon. Financial institutions can then use this framework to assess the need for model enhancements and to manage model risk. The author has implemented this framework in a Tier-1 financial institution; the model report it generated was sent to the regulators for IMM model approval, and the model was approved a few months later.
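The three-colour scoring idea can be sketched using the Basel market-risk zone boundaries. This is only an illustration of the scheme being extended; the paper's CCR methodology itself is more involved:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def traffic_light(exceptions, n_obs, coverage=0.99):
    """Three-colour score in the spirit of the Basel (1996) market-risk
    backtesting scheme: colour the model by how improbable the observed
    exception count is if its stated coverage were correct.  The 95% and
    99.99% zone cut-offs follow the Basel scheme; this is a sketch, not
    the paper's exact CCR extension."""
    cdf = binom_cdf(exceptions, n_obs, 1.0 - coverage)
    if cdf < 0.95:
        return "green"
    return "yellow" if cdf < 0.9999 else "red"

# 250 daily observations backtested against a 99% model
for k in (3, 6, 11):
    print(f"{k:2d} exceptions -> {traffic_light(k, 250)}")
```

The paper's extension assigns such a colour per time horizon, turning exceedance counts into an actionable model-risk signal.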

### Modelling Credit Spreads for Counterparty Risk: Mean-Reversion is not Needed

Published in Intelligent Risk, Oct’12

When modelling credit spreads, there is some controversy in the market as to whether they are mean-reverting or not. This is particularly important in the context of counterparty risk, at least for risk management and capital calculations, as those models need to backtest correctly and, hence, they need to follow the “real” measure, as opposed to the “risk-neutral” one. This paper shows evidence that the credit spreads of individual corporate names, by themselves, are not mean-reverting. Our results also suggest that a mean-reversion feature should be implemented in the context of joint spread-default modelling, but not in a spread-only model.
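A minimal version of a mean-reversion check on a spread series can be sketched as an AR(1)-style regression on simulated data. The estimator and parameters below are illustrative choices, not the paper's exact methodology:

```python
import numpy as np

rng = np.random.default_rng(0)

def mr_slope(s):
    """Slope b in the regression  s_t - s_{t-1} = a + b * s_{t-1} + eps.
    b significantly below zero indicates mean reversion; b near zero is
    consistent with a random walk (no mean reversion)."""
    x, dy = s[:-1], np.diff(s)
    b, a = np.polyfit(x, dy, 1)
    return b

# Simulated log-spread paths: a pure random walk vs an Ornstein-Uhlenbeck
# (mean-reverting) process, both with the same innovation size.
n, kappa = 2000, 0.05
eps = rng.normal(0.0, 0.02, size=n)
walk = np.cumsum(eps)
ou = np.zeros(n)
for t in range(1, n):
    ou[t] = ou[t - 1] - kappa * ou[t - 1] + eps[t]

print(f"random-walk slope:    {mr_slope(walk):+.4f}")
print(f"mean-reverting slope: {mr_slope(ou):+.4f}")
```

Applied to historical single-name spread series, a slope indistinguishable from zero is what supports the paper's conclusion that spreads by themselves are not mean-reverting.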

### Advanced Counterparty Risk and CVA via Stochastic Volatility

Forthcoming publication in the Journal of Financial Transformation

Exposure models in the context of counterparty risk have become central to financial institutions. They are a main driver of CVA pricing, capital calculation and risk management. It is general practice in the industry to use constant-volatility normal or log-normal models for this purpose. Ignacio Ruiz and Ricardo Pachón explain some of the strong limitations of those models and show how stochastic volatility can improve the situation substantially. This is illustrated with examples that tackle the day-to-day problems practitioners face. Using a Black-Karasinski model for the volatility coupled with a GBM model for the spot as an example, it is shown how stochastic volatility models can provide tangible benefits by improving netting effects, CVA pricing accuracy, regulatory capital calculation, initial margin calculations and the quality of exposure management.

### Technical Note: On Wrong Way Risk

Forthcoming publication in the Journal of Financial Transformation

Wrong Way Risk can be of crucial importance when computing counterparty risk measurements such as EPE or PFE profiles. It appears when the default probability of a given counterparty is not independent of its portfolio value. There are a number of approaches in the literature but, to the author’s knowledge, they all fail to provide either a computationally efficient approach or an intuitive methodology, or both. This technical note tackles the problem and describes an intuitive and fairly easy method to account for Wrong Way Risk with minimal added computational effort.
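The core intuition, that dependence between default probability and portfolio value changes the exposure that matters, can be sketched in a toy one-step model. All parameters are hypothetical and not taken from the note:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-step model: portfolio value V ~ N(0, 1); the counterparty's
# default probability rises with V (wrong-way risk).  The exponential
# tilt keeps the unconditional PD equal to p_indep.
n = 200_000
v = rng.normal(size=n)
p_indep = 0.02                                       # unconditional PD
beta = 1.0                                           # market-credit link
p_wwr = p_indep * np.exp(beta * v - 0.5 * beta**2)   # E[p_wwr] = p_indep

exposure = np.maximum(v, 0.0)                        # positive exposure only

epe_indep = float(np.mean(exposure))                        # independence
epe_wwr = float(np.mean(exposure * p_wwr) / np.mean(p_wwr)) # PD-weighted

print(f"EPE ignoring dependence:    {epe_indep:.3f}")
print(f"EPE conditional on default: {epe_wwr:.3f}")
```

Weighting exposure by the path-dependent default probability raises the exposure seen "at default" well above the independent EPE, which is exactly the effect a WWR methodology must capture efficiently.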

### CVA Demystified

Credit Value Adjustment (CVA) has been one of the “hot topics” in the financial industry since 2009. There have been several papers on the subject and the topic has been widely discussed in banking conferences around the world. However, the fundamentals of this topic have often been misunderstood or misinterpreted. Arguably, this is the result of a typical problem among quants (I am a quant): we have a tendency to explain things in unnecessarily complicated ways. This paper aims to bridge that gap. It explains, using simple examples, how CVA has been priced into banking products for centuries and how it has, in fact, been at the root of the banking industry since its origin. It shows how CVA is nothing more than the application of very simple and old concepts to modern financial derivatives, and how its misinterpretation can lead to accounting mismanagement. The mathematical formulation is not included in this text, as it can be found in several publications.

### CVA: Default Probability ain’t matter?

CVA can be priced using market implied ‘risk-neutral’ or historical ‘real-world’ parameters. There is no consensus in the market as to which approach to use. This paper illustrates in simple terms the fundamental differences between the two approaches and the consequences of using each of them.
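The difference between the two approaches can be sketched with a toy unilateral CVA under flat hazard rates. The intensities, exposure profile and discount curve below are illustrative, not calibrated to any market:

```python
import numpy as np

# Toy unilateral CVA on a flat expected-exposure profile, comparing a
# market-implied hazard rate with a historical one.  Numbers illustrative.
lgd = 0.60
times = np.linspace(0.5, 5.0, 10)          # semi-annual grid, 5y
ee = np.full_like(times, 1_000_000.0)      # flat EPE of 1m
df = np.exp(-0.02 * times)                 # 2% flat discounting

def cva(intensity):
    """CVA = LGD * sum_i EE(t_i) * DF(t_i) * P(default in (t_{i-1}, t_i]),
    with default probabilities from a flat hazard rate."""
    surv = np.exp(-intensity * np.concatenate(([0.0], times)))
    dpd = -np.diff(surv)                   # default probability per period
    return float(lgd * np.sum(ee * df * dpd))

# Hypothetical hazards: 5% implied from market CDS, 1% from ratings history
print(f"risk-neutral CVA: {cva(0.05):,.0f}")
print(f"real-world CVA:   {cva(0.01):,.0f}")
```

Market-implied intensities typically exceed historical ones, so the choice of measure moves CVA by multiples rather than basis points, which is why the debate the paper discusses matters in practice.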