Differential machine learning (ML) extends supervised learning: models are trained on examples of not only inputs and labels, but also the differentials of labels with respect to inputs. It applies in all situations where high-quality first-order derivatives of labels w.r.t. training inputs are available.
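To make the idea concrete, here is a minimal sketch of training on both labels and their differentials. It uses a deliberately simple linear model and synthetic data (both assumptions for illustration; the article trains neural networks), minimizing a weighted sum of a value loss and a derivative loss by gradient descent:

```python
import numpy as np

# Differential supervised learning on a toy problem (illustrative sketch).
# Model: f(x) = w*x + b, so df/dx = w. We fit both the labels y and the
# differentials dy/dx with a combined loss:
#   L = mean[(f(x) - y)^2] + lam * mean[(df/dx - dydx)^2]
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=256)
y = 3.0 * x + 1.0             # labels
dydx = np.full_like(x, 3.0)   # differentials of labels w.r.t. inputs

w, b = 0.0, 0.0
lr, lam = 0.1, 1.0            # learning rate; lam weights the derivative term
for _ in range(500):
    pred = w * x + b
    # gradient of the value loss plus gradient of the derivative loss
    grad_w = 2.0 * np.mean((pred - y) * x) + lam * 2.0 * (w - np.mean(dydx))
    grad_b = 2.0 * np.mean(pred - y)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 3), round(b, 3))  # w ≈ 3, b ≈ 1
```

The derivative term anchors the slope of the fitted function directly, which is exactly the extra information differential ML exploits; with neural networks the same idea applies, with the model's derivatives obtained by backpropagation.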
In the context of financial Derivatives and risk management, pathwise differentials are efficiently computed with automatic adjoint differentiation (AAD). Differential machine learning gives us unreasonably effective pricing and risk approximation. We can produce fast pricing analytics in models too complex for closed-form solutions, extract the risk factors of complex transactions and trading books, and efficiently handle risk-management tasks such as reports across a large number of scenarios, backtesting and simulation of hedge strategies, and regulatory computations like XVA, CCR, FRTB or SIMM-MVA.
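As an illustration of what a pathwise differential is (this is a hand-derived sketch, not the article's AAD code), consider a European call under Black-Scholes: each simulated payoff can be differentiated w.r.t. the spot by the chain rule, and AAD automates exactly this computation for arbitrary simulation code. The Monte Carlo average of the pathwise deltas recovers the closed-form delta:

```python
import numpy as np
from math import log, sqrt, erf

# Black-Scholes Monte Carlo, European call (parameters chosen for illustration)
S0, K, r, sigma, T = 100.0, 100.0, 0.02, 0.2, 1.0
rng = np.random.default_rng(42)
Z = rng.standard_normal(1_000_000)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * sqrt(T) * Z)

# Pathwise differential of the discounted payoff w.r.t. S0, by the chain rule:
# d/dS0 [(ST - K)^+] = 1{ST > K} * dST/dS0, with dST/dS0 = ST / S0
pathwise_delta = np.exp(-r * T) * np.mean((ST > K) * ST / S0)

# Closed-form Black-Scholes delta N(d1), for comparison
d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
bs_delta = 0.5 * (1.0 + erf(d1 / sqrt(2.0)))
print(pathwise_delta, bs_delta)  # both ≈ 0.58
```

In differential ML, such pathwise differentials serve as the derivative labels: each simulated path supplies one payoff and one vector of sensitivities, computed by AAD at a small constant multiple of the cost of the payoff itself.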
The article focuses on differential deep learning (DL), arguably the strongest application. Standard DL trains neural networks (NN) on punctual examples, whereas differential DL teaches them the shape of the target function, hence the improved performance. The article includes numerical examples, both idealized and real-world.
The third and final part of my paper “Computation Graphs for AAD and Machine Learning, Part III: Application to Derivatives Sensitivities” was just published in Wilmott, concluding a series of three articles with code dedicated to AAD and computational finance in general. It covers computation graphs, backpropagation, AAD and its implementation in finance, drawing inspiration from the recent achievements of the Superfly Analytics group at Danske Bank.
A. Savine, “Computation Graphs for AAD and Machine Learning, Part III: Application to Derivatives Sensitivities,” Wilmott, vol. 2020, no. 106, pp. 24–39, 2020.