The financial industry consists mainly of three sectors: asset management, banking, and insurance. Across the financial industry, broadly speaking, there are four tightly connected, sequential functions: valuation, risk management, portfolio management, and performance analysis; see Figure 1.
Quantitative finance supports the above sequential building blocks of finance, in particular valuation, risk management, and portfolio management. Quantitative techniques are omnipresent in risk management. In essence, risk management amounts to learning the probability (denoted by “ℙ”) of a variety of future outcomes. This goal can only be attained with sophisticated statistical tools, jointly with financial engineering models. Quantitative techniques are also used in portfolio management, where not only advanced statistics (ℙ), but also optimization theory plays an important role.
Finally, quantitative techniques are heavily used for valuation purposes. In this arena, the future risk/return profile of a given instrument determines the value of the instrument, in one of two ways. First, using mathematical models again for the probability ℙ of future outcomes, an approach followed by actuaries. Second, using mathematical models based on the theory of arbitrage, an approach followed by desk quants for derivatives valuation. Arbitrage theory in turn relies on a different probability of future outcomes, called risk-neutral and denoted by “ℚ”.
Despite the dominant role in quantitative finance of the ℙ world (risk management, portfolio management, and actuarial valuation), and the much more restricted role of the ℚ world (derivatives valuation), the ℚ applications have attracted a tremendous number of scientists to finance. As a result, from the 1980s to the first decade of the 21st century, quantitative finance was identified with risk-neutral ℚ derivative pricing. In more recent years, the financial industry has witnessed a surge of interest in quantitative ℙ models and a decrease of interest in the ℚ world.
Given the importance of ℙ and ℚ modeling in quantitative finance, we will now clarify the differences and similarities between these two worlds, following [Meucci, 2011c].
The main differences between the ℙ and ℚ worlds of quantitative finance can be summarized in the following table.
| | ℙ: Risk/portfolio management | ℚ: Derivatives pricing |
|---|---|---|
| Goal | Forecast the future | Extrapolate the present |
| Environment | Real-world probability ℙ | Risk-neutral probability ℚ |
| Processes | Discrete-time series | Continuous-time martingales |
| Tools | Multivariate statistics | Itô calculus, PDE's |
We proceed to discuss the above differences.
Risk management and portfolio management aim at forecasting and improving the distribution of future returns. This real-world probability distribution is typically denoted by the blackboard font letter “ℙ”.
Based on the ℙ distribution, the asset management, or buy-side [W], community decides which financial instruments to purchase or sell in order to improve the prospective profit-and-loss (P&L) profile of its portfolio. Similarly, banks and insurance companies rely on the ℙ distribution to decide which business lines to invest in.
The quantitative theory of risk and portfolio management started with the mean-variance framework of [Markowitz, 1952]. Next, breakthrough advances were made with the capital asset pricing model (CAPM) and the arbitrage pricing theory (APT) developed by [Treynor, 1962], [Mossin, 1966], [Sharpe, 1964], [Lintner, 1965] and [Ross, 1976].
The above theories provide tremendous insight into the markets. However, they all assume that the probability distribution ℙ is known. In reality, the ℙ distribution must be estimated from the available information. A major component of this information set is based on the past dynamics of values and other financial variables, which are monitored at discrete time intervals and stored in the form of time series.
Estimation represents the main quantitative challenge in the world of risk and portfolio management. Estimation is based on the observation of the time series of multiple risk drivers. The analysis of time series requires advanced multivariate statistics and econometric techniques. Note that in risk and portfolio management it is important to estimate the joint distribution of all the instruments in a given market, and thus financial instruments cannot be considered individually. Therefore, dimension reduction techniques such as linear factor models play a central role in the ℙ world.
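The role of dimension reduction can be illustrated with a minimal sketch on synthetic data (all parameters below are assumptions, not from the text): the covariance of a large panel of returns is approximated by a few principal-component factors plus a diagonal residual, drastically reducing the number of parameters to estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: T daily observations of N instruments driven by K common factors.
T, N, K = 1000, 50, 3
beta = rng.normal(size=(N, K))            # factor loadings
factors = rng.normal(size=(T, K)) * 0.01  # common factor returns
idio = rng.normal(size=(T, N)) * 0.005    # idiosyncratic noise
returns = factors @ beta.T + idio         # T x N panel of returns

# Full sample covariance: N(N+1)/2 free parameters, noisy for large N.
sample_cov = np.cov(returns, rowvar=False)

# Factor-model covariance via PCA: keep the top K principal components
# and force the residual covariance to be diagonal.
eigval, eigvec = np.linalg.eigh(sample_cov)
loadings = eigvec[:, -K:] * np.sqrt(eigval[-K:])            # N x K loadings
resid = np.diag(np.diag(sample_cov - loadings @ loadings.T))  # diagonal residual
factor_cov = loadings @ loadings.T + resid

# The factor-model estimate is symmetric positive definite by construction.
print(np.all(np.linalg.eigvalsh(factor_cov) > 0))
```

The factor-model estimate needs only N·K + N parameters instead of N(N+1)/2, which is what makes joint estimation feasible for large markets.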
To address the above issues, in recent years a new breed of quants, the ℙ quants, has started to populate the financial industry, and more ℙ quants are being trained in the same master's degree programs that were originally designed to train ℚ quants.
The goal of derivatives pricing is to determine the present fair value of a given financial instrument in terms of more liquid instruments whose value is determined by the law of supply and demand. Derivatives pricing is a special instance of the more general problem of valuation, i.e. determining the fair value of a given financial instrument.
Examples of instruments to be priced are convertible bonds, exotic options, mortgage backed securities, structured products, etc. Once a fair value has been determined, the sell-side [W] trader can make a market [W] on the instrument. Therefore, pricing is a complex ‘extrapolation’ exercise to determine the current market value of a financial instrument, which is then used by the sell-side community.
Quantitative derivatives pricing was initiated by [Bachelier, 1900] with the introduction of the most basic and most influential process, the Brownian motion (48.15), and its applications to the pricing of options. The theory remained dormant until [Merton, 1969] and [Black and Scholes, 1973] applied the second most influential process, the geometric Brownian motion (5.25), to option pricing.
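The geometric Brownian motion mentioned above can be simulated exactly from its log-normal solution. The following sketch (parameters are illustrative assumptions) generates GBM paths and checks the known expectation E[S_T] = S₀·e^{μT}.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions, not from the text).
s0, mu, sigma, horizon = 100.0, 0.05, 0.2, 1.0
n_paths, n_steps = 100_000, 252
dt = horizon / n_steps

# Geometric Brownian motion dS = mu*S*dt + sigma*S*dW,
# simulated exactly via its log-normal solution.
z = rng.standard_normal((n_paths, n_steps))
log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
paths = s0 * np.exp(np.cumsum(log_increments, axis=1))

# Sanity check: for a GBM, E[S_T] = S0 * exp(mu * T).
print(paths[:, -1].mean())  # close to 100 * exp(0.05) ≈ 105.13
```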
The next important step is the fundamental theorem of asset pricing (29a.38), a simpler version of which reads

$$\frac{V_t}{D_t} \;=\; \mathbb{E}^{\mathbb{Q}}_t\!\left\{\frac{V_u}{D_u}\right\}, \qquad t \le u. \tag{1}$$

According to (1), the value $V_t$ of an instrument is arbitrage-free, and thus truly fair, only if $V_t$, divided by the value $D_t$ of an investment which has instantaneous risk-free return rate $r_t$, can be written as the expected value (under a suitable probability measure ℚ) of the same ratio at any future time $u$.
A process such as the ratio in (1) is called a martingale (47.13). The relationship (1) must hold for all times $t \le u$; therefore the processes used for derivatives pricing are naturally set in continuous time.
The martingale (1) does not reward risk: the expected return of the risky investment over each subsequent step is the risk-free rate at that point in time (29a.43). Thus the probability measure which makes the normalized price process a martingale is referred to as “risk-neutral” (29a.44) and is typically denoted by the blackboard font letter “ℚ”.
The risk-neutral ℚ measure can be obtained from the ℙ process of the underlying instrument, for example from the geometric Brownian motion of a stock (5.25).
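As a concrete sketch (parameters are illustrative assumptions): under ℚ the GBM drift becomes the risk-free rate, so the fair price of a European call is the discounted ℚ-expectation of its payoff, which can be checked against the Black-Scholes closed form.

```python
import numpy as np
from math import log, sqrt, exp
from statistics import NormalDist

rng = np.random.default_rng(2)

# Illustrative parameters (assumptions): spot, strike, rate, vol, maturity.
s0, k, r, sigma, t = 100.0, 105.0, 0.03, 0.2, 1.0

# Under the risk-neutral measure Q, the GBM drift is the risk-free rate r.
z = rng.standard_normal(500_000)
s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * sqrt(t) * z)
mc_price = exp(-r * t) * np.maximum(s_t - k, 0.0).mean()

# Closed-form Black-Scholes price for the same call.
d1 = (log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
d2 = d1 - sigma * sqrt(t)
N = NormalDist().cdf
bs_price = s0 * N(d1) - k * exp(-r * t) * N(d2)

print(mc_price, bs_price)  # the two agree to Monte Carlo accuracy
```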
The derivative pricing quants who operate in the ℚ world are specialists with deep knowledge of the specific products they model. Instruments are priced individually, and thus the problems in the ℚ world are low-dimensional in nature.
Calibration is one of the main challenges in the ℚ world: once a continuous-time parametric process has been calibrated to a set of traded instruments through a relationship such as (1), a similar relationship is used to define the values of new derivatives.
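A minimal example of calibration (with hypothetical market data) is backing out the implied volatility: find the single parameter sigma for which the Black-Scholes model reproduces an observed option quote, here by simple bisection.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def bs_call(s0, k, r, t, sigma):
    """Black-Scholes price of a European call."""
    d1 = (log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    N = NormalDist().cdf
    return s0 * N(d1) - k * exp(-r * t) * N(d2)

def implied_vol(price, s0, k, r, t, lo=1e-4, hi=5.0, tol=1e-8):
    """Calibrate sigma by bisection: the call price is increasing in sigma."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(s0, k, r, t, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical market quote for an at-the-money call (an assumption).
market_price = 8.0
sigma_star = implied_vol(market_price, s0=100.0, k=100.0, r=0.02, t=1.0)
print(sigma_star)  # the calibrated vol reprices the quote
```

Once sigma_star is calibrated to the traded option, the same model can be used to value other, less liquid derivatives on the same underlying.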
The main quantitative tools necessary to handle continuous-time processes are Itô's stochastic calculus, partial differential equations (PDE's), partial integro-differential equations (PIDE's), Fourier and Laplace analysis, etc. Throughout the past decades, these advanced techniques have attracted mathematicians, physicists and engineers to the field of derivatives pricing in the ℚ world.
From a comparison of the columns in Table 2, it appears as though the ℙ and the ℚ worlds are very different. In reality, commonalities between these two worlds abound, and interactions occur frequently in different areas.
| Area of ℙ-ℚ commonality | Specific ℙ-ℚ overlap |
|---|---|
| Risk premium | Switch between ℙ and ℚ |
| Stochastic processes | Discrete and continuous time |
| Numerical methods | Trees, Monte Carlo |
| Statistical arbitrage | ℚ-pricing-based mean-reversion |
| Algorithmic trading | Applied to buy-side and sell-side |
We proceed to discuss the above commonalities, pointing to the areas in the present work where we address them in detail.
Mathematically, the risk-neutral probability ℚ and the real-world probability ℙ associate different weights to the same possible outcomes for the same financial variables. The transition from one set of probability weights to the other defines the so-called “risk premium”. Knowledge of the risk premium allows us, in principle, to switch from one world to the other. Unfortunately, the correct estimation of the risk premium is a challenging task.
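For a geometric Brownian motion, the switch between the two worlds is particularly transparent: shifting the drift by the market price of risk (the risk premium per unit of volatility) turns ℙ dynamics into ℚ dynamics. A minimal sketch, with illustrative parameters that are assumptions, not from the text:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative GBM parameters (assumptions): real-world drift mu, risk-free r.
mu, r, sigma, t = 0.08, 0.03, 0.2, 1.0
lam = (mu - r) / sigma   # market price of risk: risk premium per unit of vol

# The same Brownian shocks, with the drift shifted by the risk premium,
# move the simulation from the P world to the Q world (Girsanov-type shift).
z = rng.standard_normal(200_000)
log_ret_p = (mu - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z
log_ret_q = log_ret_p - lam * sigma * t   # the drift mu becomes r under Q

# Under Q, the expected gross return is the risk-free one: exp(r * t).
print(np.exp(log_ret_q).mean())  # close to exp(0.03) ≈ 1.0305
```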
We define risk premium in the context of risk assessment in Section 7.2. We then use risk premium in valuation theory: arbitrage pricing theory (APT) in Section 29b.4 and actuarial valuation in Chapter 30. We also use risk premium in the context of systematic strategies in Step 9b.
Stochastic processes are the building blocks of any quantitative model, both in the ℙ world and in the ℚ world. Although ℚ quants focus on continuous-time risk-neutral processes and ℙ quants focus on discrete-time real-world processes, the same models are used in both areas, possibly under different assumptions and names. Table 2.2 summarizes the main features of the most popular models.
We discuss stochastic processes in the context of Quest for invariance in Step 2.
The theoretical stochastic processes discussed above must be implemented in practice. In order to do so, the most popular numerical techniques are trees and Monte Carlo simulations.
Trees represent a process as an ever-expanding sequence of potential outcomes: the state of the world today will give rise to multiple possible outcomes tomorrow; each of these in turn will give rise to multiple possible outcomes the day after tomorrow, and so on. In general, with trees the number of potential outcomes (or nodes) grows exponentially as the time horizon increases. Instead, with Monte Carlo simulations, the number of possible outcomes, also known as paths, of a stochastic process is kept constant throughout the evolution of the process.
The computationally more costly trees are used when it is important to make decisions along the trajectory of the stochastic process, whereas Monte Carlo is used when only the distribution of the process is required. Therefore, in the ℙ world of risk and portfolio management, trees are used to design dynamic strategies, whereas Monte Carlo is used for risk monitoring purposes such as value-at-risk computations. In the ℚ world, trees are used for instance to price American options, which can be exercised before expiry, whereas Monte Carlo is used to price Asian options, i.e. options on the average value of an underlying instrument over a pre-specified period of time.
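A compact illustration of the tree approach (parameters are illustrative assumptions) is the Cox-Ross-Rubinstein binomial tree: backward induction visits every node, which is exactly what allows the early-exercise decision of an American option at each point in time.

```python
import numpy as np

def crr_put(s0, k, r, sigma, t, n_steps, american=False):
    """Cox-Ross-Rubinstein binomial tree for a put option."""
    dt = t / n_steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)   # risk-neutral up-move probability
    disc = np.exp(-r * dt)

    # Put payoffs at the n_steps + 1 terminal nodes (j = number of up moves).
    j = np.arange(n_steps + 1)
    s = s0 * u**j * d**(n_steps - j)
    v = np.maximum(k - s, 0.0)

    # Backward induction through the exponentially growing tree.
    for step in range(n_steps, 0, -1):
        s = s[1:] / u                     # stock prices one step earlier
        v = disc * (q * v[1:] + (1 - q) * v[:-1])
        if american:
            v = np.maximum(v, k - s)      # early-exercise check at each node
    return v[0]

eur = crr_put(100.0, 100.0, 0.05, 0.2, 1.0, 500)
amr = crr_put(100.0, 100.0, 0.05, 0.2, 1.0, 500, american=True)
print(eur, amr)  # the American put is worth more than the European one
```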
Hedging is a clear example where the ℙ world and the ℚ world interact directly.
Hedging aims at protecting the future P&L of a given position from a set of risk factors. Therefore, hedging is a ℙ world concept.
In order to determine the amounts of the hedging instruments to buy or sell, we need to compute the sensitivity of the given position and of the hedging instruments to those risk factors.
Such sensitivities are known as the “Greeks”. The most basic Greek is the “delta” of an option written on a given financial instrument, which is the sensitivity of the option to the underlying value. The delta of an option tells the trader how much of the underlying instrument to buy or to sell in order to protect the option from swings of the underlying.
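The delta can be sketched concretely for the Black-Scholes model (parameters below are illustrative assumptions), together with a finite-difference check that it really is the sensitivity of the option price to the underlying value.

```python
from math import log, sqrt, exp
from statistics import NormalDist

N = NormalDist().cdf

def bs_call(s0, k, r, t, sigma):
    """Black-Scholes price of a European call."""
    d1 = (log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s0 * N(d1) - k * exp(-r * t) * N(d2)

def bs_delta(s0, k, r, t, sigma):
    """Black-Scholes delta of a call: sensitivity to the underlying price."""
    d1 = (log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    return N(d1)

# Illustrative parameters (assumptions, not from the text).
s0, k, r, t, sigma = 100.0, 100.0, 0.02, 0.5, 0.25
delta = bs_delta(s0, k, r, t, sigma)

# Finite-difference check: delta ≈ change in price per unit change in spot.
h = 0.01
fd = (bs_call(s0 + h, k, r, t, sigma) - bs_call(s0 - h, k, r, t, sigma)) / (2 * h)
print(delta, fd)  # a trader short one call buys ~delta units of the underlying
```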
The Greeks are computed using pricing models from the ℚ world and are then applied in the ℙ world for hedging. Interestingly, those very same ℚ world pricing models can be derived based on the ℙ world concept of hedging.
We discuss hedging in the context of risk attribution in Section 8a.5.
The ℚ world has also moved into the ℙ world in the area of statistical arbitrage algorithms. The general steps of this interaction are as follows.
First, ℚ models (or other tools) are used to identify misalignments among the current values of financial instruments. Second, one assumes that the misaligned values will eventually converge to the values predicted by the models. Therefore, a potential expected return in the real world ℙ, or “alpha”, is identified as the difference between the predicted (future, “fair”) values and the current misaligned values. Third, if the alpha is positive, a long position is set up, i.e. the misaligned instruments are bought, in the expectation that they will reach their higher fair value and thus realize a profit; if the alpha is negative, a short position is set up, i.e. the misaligned instruments are sold.
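The steps above can be sketched on synthetic data (everything below is an illustrative assumption, not an actual strategy): a mean-reverting spread between two "misaligned" instruments is traded on its z-score, going long when the spread is cheap and short when it is rich.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic mean-reverting spread between two "misaligned" instruments,
# modeled as a discrete-time Ornstein-Uhlenbeck process (purely illustrative).
n, kappa, long_run_mean, vol = 2000, 0.05, 0.0, 1.0
spread = np.zeros(n)
for i in range(1, n):
    spread[i] = (spread[i - 1]
                 + kappa * (long_run_mean - spread[i - 1])
                 + vol * rng.standard_normal())

# Signal: z-score of the spread (full-sample here for brevity; a live
# strategy would use a rolling window to avoid look-ahead bias).
z = (spread - spread.mean()) / spread.std()

# Alpha > 0 when the spread is cheap: buy it; alpha < 0 when rich: sell it.
position = np.where(z < -1.0, 1.0, np.where(z > 1.0, -1.0, 0.0))
pnl = float((position[:-1] * np.diff(spread)).sum())
print(pnl)  # tends to be positive when the spread truly mean-reverts
```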
We discuss statistical arbitrage in the context of cointegration in Section 2.8.
Historically, sell-side market makers have implemented algorithms to provide liquidity to the market, while charging a premium for their sell-side services. On the other hand, buy-side high-frequency traders have implemented algorithms that efficiently execute orders, absorbing liquidity and delivering profits.
More recently, the distinction between sell-side and buy-side has become blurry, as the two groups are becoming competitors.
We discuss algorithmic trading in the context of optimal execution in Chapter 10.