Several types of calculations can be optimised

Our book illustrates several of these examples in detail.
Counterparty Credit Risk

The pricing step in Counterparty Credit Risk (CCR) Monte Carlo simulations is a computational bottleneck. We propose to optimise it by exploiting the reduced dimensionality of the factor models used in the simulation to generate replicas of the pricing functions that are

    1. Computationally easy to calibrate
    2. Highly accurate (e.g. errors of the order of 1e-5)
    3. Ultra-fast to evaluate

The replicas can be built with Chebyshev Tensors or Deep Neural Nets. Which works best depends on the details of the Monte Carlo engine and the portfolio.
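By way of illustration, here is a minimal Python sketch (not the book's production setup) of what such a replica can look like: a two-dimensional Chebyshev Tensor is built for a toy pricing function, a Black-Scholes call driven by two hypothetical model factors (spot and volatility), and then evaluated on 100,000 simulated scenarios. The pricer, domains and degrees are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.stats import norm

# Stand-in for an expensive pricing call: a Black-Scholes call price driven by two
# model factors (spot and volatility). In a real CCR engine this would be the pricing
# library evaluated on the low-dimensional factors of the risk-factor model.
K, T, r = 100.0, 1.0, 0.01
def price(spot, vol):
    d1 = (np.log(spot / K) + (r + 0.5 * vol**2) * T) / (vol * np.sqrt(T))
    return spot * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - vol * np.sqrt(T))

deg = 20                                              # polynomial degree per dimension
(s_lo, s_hi), (v_lo, v_hi) = (60.0, 160.0), (0.1, 0.5)
z = np.cos(np.pi * np.arange(deg + 1) / deg)          # Chebyshev-Lobatto nodes in [-1, 1]
spots = 0.5 * (s_lo + s_hi) + 0.5 * (s_hi - s_lo) * z
vols = 0.5 * (v_lo + v_hi) + 0.5 * (v_hi - v_lo) * z

# Calibration: (deg+1)^2 = 441 calls to the real pricer on the tensor grid.
V = price(spots[:, None], vols[None, :])
cx = C.chebfit(z, V, deg)                             # fit along the spot dimension
coeffs = C.chebfit(z, cx.T, deg).T                    # then along the vol dimension

def replica(spot, vol):
    """Ultra-fast evaluation of the tensorised Chebyshev replica."""
    zs = (2.0 * spot - (s_lo + s_hi)) / (s_hi - s_lo)
    zv = (2.0 * vol - (v_lo + v_hi)) / (v_hi - v_lo)
    return C.chebval2d(zs, zv, coeffs)

# 100,000 scenarios (drawn uniformly here just to exercise the replica).
rng = np.random.default_rng(0)
mc_spots = rng.uniform(s_lo, s_hi, 100_000)
mc_vols = rng.uniform(v_lo, v_hi, 100_000)
print("max abs error:", np.max(np.abs(replica(mc_spots, mc_vols) - price(mc_spots, mc_vols))))
```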

XVA calculations, as well as IMM capital and PFE simulations, can be accelerated by orders of magnitude in this way.

The number of XVA sensitivities that can be computed also increases by orders of magnitude.

The fundamentals of the proposed approach were published in Risk Magazine, Cutting Edge, Apr’12.

Market Risk

Sensitivity-based pricing in Market Risk engines has important limitations. Standard revaluation grids are used as an upgrade, but they too introduce significant errors in the pricing step.

Revaluation grids can be improved by orders of magnitude by placing the grid on Chebyshev points (instead of the traditionally used equally spaced ones) and using polynomial interpolants (instead of the traditional linear or cubic spline interpolation).
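The sketch below compares the two approaches on a toy example, assuming a Black-Scholes call price as the pricing function and the same budget of 11 pricing calls for both grids; the numbers are illustrative, not taken from the book.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev
from scipy.stats import norm

# Toy pricing function: a Black-Scholes call price as a function of spot.
K, T, r, vol = 100.0, 1.0, 0.01, 0.2
def price(s):
    d1 = (np.log(s / K) + (r + 0.5 * vol**2) * T) / (vol * np.sqrt(T))
    return s * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - vol * np.sqrt(T))

a, b, n_pts = 60.0, 140.0, 11            # same budget of 11 pricing calls for both grids

# (1) Traditional revaluation grid: equally spaced points + linear interpolation.
grid = np.linspace(a, b, n_pts)
grid_prices = price(grid)
linear = lambda s: np.interp(s, grid, grid_prices)

# (2) Chebyshev points + polynomial interpolant of degree n_pts - 1.
cheb = Chebyshev.interpolate(price, n_pts - 1, domain=[a, b])

s = np.linspace(a, b, 2001)              # dense set of test scenarios
print("equally spaced + linear, max error:", np.max(np.abs(linear(s) - price(s))))
print("Chebyshev + polynomial, max error: ", np.max(np.abs(cheb(s) - price(s))))
```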

For FRTB IMA, we have designed a computational solution based on Orthogonal Chebyshev Sliders, published in Wilmott Magazine, Jan’21. It reduces the computational burden of full revaluation while passing the P&L attribution tests in all the cases we tested.
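For intuition only, here is a deliberately simplified sketch of a slider-style decomposition: the price is approximated by the base value plus a sum of low-dimensional Chebyshev interpolants, each built by shocking one (already orthogonalised) factor at a time. The toy pricer, factor grouping and shock ranges are assumptions, and the sketch should not be read as the exact construction published in the Wilmott paper.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Toy full-revaluation function of four risk factors. It is additive across factors,
# so 1-D sliders suffice here; real portfolios need sliders on small groups of
# (typically orthogonally rotated) factors.
def full_price(x):
    return 100.0 * np.exp(-0.3 * x[0]) + 5.0 * x[1] ** 2 + 10.0 * np.tanh(x[2]) - 2.0 * x[3]

n_factors, deg, lo, hi = 4, 10, -3.0, 3.0
base = np.zeros(n_factors)                 # base scenario
p0 = full_price(base)

def build_slider(k):
    """Chebyshev interpolant of the price when only factor k is shocked."""
    def shocked(xk_values):
        return np.array([full_price(np.where(np.arange(n_factors) == k, xk, base))
                         for xk in np.atleast_1d(xk_values)])
    return Chebyshev.interpolate(shocked, deg, domain=[lo, hi])

sliders = [build_slider(k) for k in range(n_factors)]

def slider_price(x):
    """P(x) ~ P(base) + sum_k [ slider_k(x_k) - P(base) ]."""
    return p0 + sum(sliders[k](x[k]) - p0 for k in range(n_factors))

rng = np.random.default_rng(1)
scenario = rng.uniform(lo, hi, n_factors)
print("full revaluation:", full_price(scenario), " slider proxy:", slider_price(scenario))
```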

Simulation of Sensitivities and Dynamic Initial Margin (DIM)

Simulating sensitivities inside a Monte Carlo simulation is anything but easy. In a paper published in Risk Magazine, Cutting Edge section, Apr’21, we show how Chebyshev Tensors can be used for this purpose.

This technique uses the existing pricing library, building a replica of its sensitivity functions from their values at a small number of Chebyshev points.
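A minimal sketch of this idea, assuming a Black-Scholes call as a stand-in for the pricing library and bump-and-revalue for its delta: the delta is sampled at a handful of Chebyshev points, and the resulting replica is then evaluated across all Monte Carlo scenarios.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev
from scipy.stats import norm

# Stand-in for an expensive pricing-library call: a Black-Scholes call price.
K, T, r, vol = 100.0, 1.0, 0.01, 0.2
def price(s):
    d1 = (np.log(s / K) + (r + 0.5 * vol**2) * T) / (vol * np.sqrt(T))
    return s * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - vol * np.sqrt(T))

# Sensitivity (delta) obtained from the library by bump-and-revalue.
def delta(s, h=1e-4):
    return (price(s + h) - price(s - h)) / (2.0 * h)

# Replica of the delta function, built from its values at a few Chebyshev points.
delta_replica = Chebyshev.interpolate(delta, 15, domain=[50.0, 160.0])

# Evaluate the replica across all Monte Carlo scenarios at a given time step.
rng = np.random.default_rng(0)
spots = 100.0 * np.exp(rng.normal(-0.02, 0.2, 50_000))
spots = np.clip(spots, 50.0, 160.0)      # keep scenarios inside the replica's domain
simulated_deltas = delta_replica(spots)
print("max abs error vs bump-and-revalue:", np.max(np.abs(simulated_deltas - delta(spots))))
```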

Once sensitivities have been simulated, further simulations can be built on top of them. We illustrate this with a simulation of Initial Margin (SIMM) whose errors stay below 1e-3 across all quantiles and time steps of the simulation.

Pricing Model Calibration

Some models, such as Rough Volatility models, are highly valuable because they are very good at replicating market-observed quantities (e.g. the volatility skew), but they are computationally very expensive to calibrate.

In a series of remarkable papers (e.g. Deep Learning Volatility, by Horvath et al.), Deep Neural Nets were used to alleviate this problem, with very good results. One limitation of this solution is that a vast number of computationally costly calls to the pricing function (e.g. of the order of 1e6) are needed to calibrate the Neural Net.

This problem is ideal for Chebyshev Tensors and their remarkable mathematical properties as function approximators. In this pre-print paper we explain the approach and present several tests, which show that Chebyshev Tensors are 10 to 100 times more efficient at this job than Deep Neural Nets.
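The sketch below illustrates the replicate-then-calibrate pattern with an entirely hypothetical one-parameter “slow” model: each strike’s parameter-to-vol map is replaced by a one-dimensional Chebyshev replica, and the calibration then runs against the fast replicas only. The model, quotes and parameter range are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev
from scipy.optimize import minimize_scalar

# Hypothetical "slow" pricing model: implied vol at strike k for model parameter theta
# (a cheap stand-in for e.g. a rough-volatility Monte Carlo pricer).
def slow_model_vol(theta, k):
    return 0.2 + 0.1 * np.tanh(theta) * (k / 100.0 - 1.0) + 0.02 * theta**2

strikes = [90.0, 100.0, 110.0]
market_vols = np.array([0.23, 0.21, 0.20])          # hypothetical market quotes

# Offline step: one 1-D Chebyshev replica of theta -> vol per strike (few model calls each).
replicas = [Chebyshev.interpolate(lambda t, k=k: slow_model_vol(t, k), 20, domain=[-2.0, 2.0])
            for k in strikes]

# Online step: calibrate theta against the market quotes using only the fast replicas.
def objective(theta):
    model = np.array([rep(theta) for rep in replicas])
    return np.sum((model - market_vols) ** 2)

res = minimize_scalar(objective, bounds=(-2.0, 2.0), method="bounded")
print("calibrated theta:", res.x, " objective:", res.fun)
```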

Implied Volatility Function Evaluation

Evaluating on a computer the inverse of the Black-Scholes option pricing formula, known as the Implied Volatility function, is far from trivial. The paper The Chebyshev Method for the Implied Volatility, by K. Glau et al., shows how this computation can be done very efficiently with the aid of Chebyshev methods.
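The following is a much simplified, one-dimensional version of that idea (the paper itself works with carefully chosen transformed variables over the whole moneyness/maturity domain): the inverse map from normalised at-the-money call price to total volatility is interpolated at Chebyshev points, each node requiring one offline root-finding call. The degree and volatility range are assumptions.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev
from scipy.optimize import brentq
from scipy.stats import norm

# Normalised at-the-money Black-Scholes call price (forward = strike = 1) as a
# function of total volatility sigma * sqrt(T).
def bs_price(total_vol):
    d = 0.5 * total_vol
    return norm.cdf(d) - norm.cdf(-d)

lo, hi = 0.05, 2.0                       # range of sigma * sqrt(T) covered by the proxy
p_lo, p_hi = bs_price(lo), bs_price(hi)

# Offline: interpolate the *inverse* map price -> total vol at Chebyshev points,
# solving one root-finding problem per node.
def inverse(p):
    return np.array([brentq(lambda v: bs_price(v) - pi, lo, hi) for pi in np.atleast_1d(p)])

iv_proxy = Chebyshev.interpolate(inverse, 30, domain=[p_lo, p_hi])

# Online: implied vols for many option prices, one polynomial evaluation each.
prices = np.linspace(p_lo, p_hi, 1_000)
ivs = iv_proxy(prices)
print("max abs price error after round trip:", np.max(np.abs(bs_price(ivs) - prices)))
```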

Portfolio Optimisation

In the chapter of our book dedicated to this topic, we show how to run optimisation problems with the following two examples:

  • Balance Sheet – we collaborated with a software vendor specialised in banking book balance sheet management. A Chebyshev Tensor was built for a whole balance sheet with the help of Tensor Extension Algorithms. The goal of the exercise was to find the portfolio configuration of the balance sheet that maximises its Net Interest Income (NII), subject to a number of regulatory constraints. An optimisation routine was then run over the balance sheet, in which the NII function had been replaced by its fast-to-compute Chebyshev Tensor replica; the output increased NII by 10% (a schematic sketch of this replicate-then-optimise pattern is shown after this list).
  • Minimal Funding Cost of Initial Margin – in a test portfolio, 16 counterparties were artificially created, each with 100 randomly generated IR Swaps under realistic conditions. Could we find pairs of payer/receiver swaps that, allocated to different counterparties, would decrease the total Margin Valuation Adjustment (MVA) of the portfolio? An ultra-fast and accurate MVA calculation was created with Chebyshev Tensors, a Differential Evolution for Integer Programming optimiser was used, and pairs of swaps were found that reduced MVA by between 10% and 30%.
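Below is a schematic Python sketch of the replicate-then-optimise pattern behind both examples, with an entirely hypothetical two-variable NII-style objective standing in for the expensive engine; the objective, bounds and optimiser choice are illustrative assumptions, not the setups used in the book.

```python
import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.optimize import differential_evolution

# Hypothetical "expensive" objective: an NII-style metric of two allocation weights
# (a cheap stand-in for a full balance-sheet engine call).
def nii(w1, w2):
    return 3.0 * np.log1p(w1) + 2.0 * np.log1p(2.0 * w2) - 1.5 * (w1 + w2) ** 2

deg = 12
z = np.cos(np.pi * np.arange(deg + 1) / deg)          # Chebyshev-Lobatto nodes in [-1, 1]
w = 0.5 * (z + 1.0)                                   # mapped to the allocation range [0, 1]

# Offline: tensorised Chebyshev replica built from (deg+1)^2 expensive calls.
V = nii(w[:, None], w[None, :])
cx = C.chebfit(z, V, deg)
coeffs = C.chebfit(z, cx.T, deg).T

def nii_replica(x):
    return C.chebval2d(2.0 * x[0] - 1.0, 2.0 * x[1] - 1.0, coeffs)

# Online: run the optimiser against the fast replica (minimise the negative NII).
res = differential_evolution(lambda x: -nii_replica(x), bounds=[(0.0, 1.0), (0.0, 1.0)], seed=0)
print("optimal weights:", res.x, " replica NII:", -res.fun, " true NII:", nii(*res.x))
```
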
Pricing Function Cloning

Often, in banks, pricing libraries live in one system while risk calculation engines live in others, and the two have difficulty interacting with each other. Chebyshev Tensors and Deep Neural Nets can be used to “clone” a pricing function from the pricing library into the risk engine. This works particularly well in conjunction with Composition Techniques or other dimensionality reduction methods.
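As a minimal illustration of the cloning workflow, the sketch below builds a one-dimensional Chebyshev replica on the “pricing library” side, exports its coefficients as plain data, and rebuilds and evaluates the clone on the “risk engine” side. The pricer and the JSON serialisation are assumptions made for the example.

```python
import json
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev
from scipy.stats import norm

# --- pricing-library side: build the clone and export it as plain data ----------------
K, T, r, vol = 100.0, 1.0, 0.01, 0.2
def library_price(s):                     # stand-in for a call into the pricing library
    d1 = (np.log(s / K) + (r + 0.5 * vol**2) * T) / (vol * np.sqrt(T))
    return s * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - vol * np.sqrt(T))

clone = Chebyshev.interpolate(library_price, 25, domain=[50.0, 200.0])
payload = json.dumps({"domain": clone.domain.tolist(), "coef": clone.coef.tolist()})

# --- risk-engine side: rebuild the clone from the exported coefficients ---------------
data = json.loads(payload)
clone_in_engine = Chebyshev(data["coef"], domain=data["domain"])

spots = np.array([80.0, 100.0, 150.0])
print("clone:  ", clone_in_engine(spots))
print("library:", library_price(spots))
```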