
2011 | Book

Statistical Tools for Finance and Insurance

edited by: Pavel Cizek, Wolfgang Karl Härdle, Rafał Weron

Publisher: Springer Berlin Heidelberg


About this book

Statistical Tools for Finance and Insurance presents ready-to-use solutions, theoretical developments and method construction for many practical problems in quantitative finance and insurance. Written by practitioners and leading academics in the field, this book offers a unique combination of topics from which every market analyst and risk manager will benefit.

Features of the significantly enlarged and revised second edition:

- Offers insight into new methods and the applicability of the stochastic technology
- Provides the tools, instruments and (online) algorithms for recent techniques in quantitative finance and modern treatments in insurance calculations
- Covers topics such as:

- expected shortfall for heavy tailed and mixture distributions*

- pricing of variance swaps*

- volatility smile calibration in FX markets

- pricing of catastrophe bonds and temperature derivatives*

- building loss models and ruin probability approximation

- insurance pricing with GLM*

- equity-linked retirement plans*

(New topics in the second edition are marked with *.)

- Presents extensive examples

Table of Contents

Frontmatter

Finance

Frontmatter
1. Models for heavy-tailed asset returns
Abstract
Many of the concepts in theoretical and empirical finance developed over the past decades – including the classical portfolio theory, the Black-Scholes-Merton option pricing model or the RiskMetrics variance-covariance approach to Value at Risk (VaR) – rest upon the assumption that asset returns follow a normal distribution. But this assumption is not justified by empirical data! Rather, the empirical observations exhibit excess kurtosis, more colloquially known as fat tails or heavy tails (Guillaume et al., 1997; Rachev and Mittnik, 2000). The contrast with the Gaussian law can be striking, as in Figure 1.1 where we illustrate this phenomenon using a ten-year history of the Dow Jones Industrial Average (DJIA) index.
Szymon Borak, Adam Misiorek, Rafał Weron
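The heavy-tails phenomenon described in this abstract can be illustrated with a small sketch (not from the book): sample excess kurtosis of simulated Gaussian returns is near zero, while a Student-t sample shows the pronounced excess kurtosis typical of asset returns. The parameters are illustrative only.

```python
import math
import random

def excess_kurtosis(xs):
    """Sample excess kurtosis: E[(x - mean)^4] / sd^4 - 3 (zero for a Gaussian)."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / var ** 2 - 3.0

def student_t(df, rng):
    """Draw from a Student-t distribution as Z / sqrt(chi2_df / df)."""
    z = rng.gauss(0.0, 1.0)
    chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

rng = random.Random(42)
n = 200_000
gaussian = [rng.gauss(0.0, 1.0) for _ in range(n)]
heavy = [student_t(6, rng) for _ in range(n)]

print(f"Gaussian excess kurtosis:     {excess_kurtosis(gaussian):+.2f}")  # near 0
print(f"Student-t(6) excess kurtosis: {excess_kurtosis(heavy):+.2f}")     # population value 6/(6-4) = 3
```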
2. Expected shortfall for distributions in finance
Abstract
It has been nearly 50 years since the appearance of the pioneering paper of Mandelbrot (1963) on the non-Gaussianity of financial asset returns, and their highly fat-tailed nature is now one of the most prominent and accepted stylized facts. The recent book by Jondeau et al. (2007) is dedicated to the topic, while other chapters and books discussing the variety of non-Gaussian distributions of use in empirical finance include McDonald (1997), Knight and Satchell (2001), and Paolella (2007).
Simon A. Broda, Marc S. Paolella
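As a hedged illustration of why the tail distribution matters for risk measures (this sketch is not taken from the chapter), the empirical 1% expected shortfall of a unit-variance Student-t(4) sample is markedly deeper than that of a Gaussian sample with the same variance:

```python
import math
import random

def expected_shortfall(returns, alpha=0.01):
    """Empirical ES: mean of the worst alpha-fraction of returns."""
    tail = sorted(returns)[: max(1, int(alpha * len(returns)))]
    return sum(tail) / len(tail)

def student_t(df, rng):
    z = rng.gauss(0.0, 1.0)
    chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

rng = random.Random(7)
n = 100_000
normal = [rng.gauss(0.0, 1.0) for _ in range(n)]
# rescale t(4) to unit variance (Var = df / (df - 2) = 2) for a fair comparison
scale = 1.0 / math.sqrt(2.0)
heavy = [scale * student_t(4, rng) for _ in range(n)]

es_n = expected_shortfall(normal)
es_t = expected_shortfall(heavy)
print(f"1% ES, normal: {es_n:.2f}; Student-t(4), unit variance: {es_t:.2f}")
```

Even after matching variances, the fat-tailed sample produces a substantially more negative expected shortfall, which is the stylized fact the chapter builds on.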
3. Modelling conditional heteroscedasticity in nonstationary series
Abstract
A vast amount of econometric and statistical research deals with modeling financial time series and their volatility, which measures the dispersion of a series at a point in time (i.e., the conditional variance). Although financial markets have experienced many shorter and longer periods of instability or uncertainty in recent decades, such as the Asian crisis (1997), the launch of the European currency (1999), the “dot-com” technology-bubble crash (2000–2002), the terrorist attacks (September 2001), the war in Iraq (2003) and the current global recession (2008–2009), the most commonly used econometric models are based on the assumption of stationarity and time homogeneity; in other words, the structure and parameters of a model are supposed to be constant over time. This includes linear and nonlinear autoregressive (AR) and moving-average models and conditional heteroscedasticity (CH) models such as ARCH (Engle, 1982) and GARCH (Bollerslev, 1986), stochastic volatility models (Taylor, 1986), as well as their combinations.
Pavel Čížek
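A minimal sketch of the GARCH(1,1) model mentioned in the abstract (illustrative parameters, not from the book): returns themselves are nearly uncorrelated, yet squared returns are positively autocorrelated, which is the volatility clustering these models capture.

```python
import math
import random

def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=0):
    """Simulate r_t = sigma_t * z_t with
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = math.sqrt(var) * rng.gauss(0.0, 1.0)
        returns.append(r)
        var = omega + alpha * r * r + beta * var
    return returns

def autocorr(xs, lag):
    n = len(xs)
    m = sum(xs) / n
    cov = sum((xs[i] - m) * (xs[i + lag] - m) for i in range(n - lag)) / (n - lag)
    var = sum((x - m) ** 2 for x in xs) / n
    return cov / var

r = simulate_garch11(20_000)
sq = [x * x for x in r]
print(f"lag-1 autocorr of returns:         {autocorr(r, 1):+.3f}")   # near 0
print(f"lag-1 autocorr of squared returns: {autocorr(sq, 1):+.3f}")  # clearly positive
```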
4. FX smile in the Heston model
Abstract
The universal benchmark for option pricing is flawed. The Black-Scholes formula is based on the assumption of a geometric Brownian motion (GBM) dynamics with constant volatility. Yet, the model-implied volatilities for different strikes and maturities of options are not constant and tend to be smile shaped (or in some markets skewed). Over the last three decades researchers have tried to find extensions of the model in order to explain this empirical fact.
Agnieszka Janek, Tino Kluge, Rafał Weron, Uwe Wystup
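One ingredient of the Heston model discussed here is its stochastic variance, a CIR square-root process. The sketch below (an assumption-laden Euler discretization with full truncation, not the chapter's calibration method) simulates only this variance leg and checks that it mean-reverts to the long-run level theta:

```python
import math
import random

def simulate_heston_variance(v0, kappa, theta, xi, dt, steps, seed=1):
    """Euler scheme (full truncation) for the Heston variance process:
    dv_t = kappa * (theta - v_t) dt + xi * sqrt(v_t) dW_t."""
    rng = random.Random(seed)
    v = v0
    path = []
    for _ in range(steps):
        v_pos = max(v, 0.0)  # truncate so the square root stays defined
        v = v + kappa * (theta - v_pos) * dt \
              + xi * math.sqrt(v_pos * dt) * rng.gauss(0.0, 1.0)
        path.append(max(v, 0.0))
    return path

# illustrative parameters: long-run variance theta = 0.04, i.e. 20% volatility
path = simulate_heston_variance(v0=0.10, kappa=2.0, theta=0.04, xi=0.3,
                                dt=0.01, steps=50_000)
mean_v = sum(path) / len(path)
print(f"time-averaged variance: {mean_v:.4f} (long-run level theta = 0.04)")
```

The randomness of the volatility path is exactly what lets the model produce the smile-shaped implied volatilities that constant-volatility Black-Scholes cannot.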
5. Pricing of Asian temperature risk
Abstract
Global warming increases weather risk through rising temperatures and increasing variability of weather patterns. PricewaterhouseCoopers (2005) lists the top five sectors in need of financial instruments to hedge weather risk. An increasing number of businesses hedge such risks with weather derivatives (WD): financial contracts whose payments depend on weather-related measurements.
Fred Espen Benth, Wolfgang Karl Härdle, Brenda Lopez Cabrera
6. Variance swaps
Abstract
Traditionally, volatility is viewed as a measure of the variability, or risk, of an underlying asset. Recently, however, investors have begun to look at volatility from a different angle, and variance swaps have been created.
Wolfgang Karl Härdle, Elena Silyakova
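The payoff mechanics of a variance swap can be sketched in a few lines (a toy example, not from the book): realized variance is annualized from squared log returns and exchanged against a fixed variance strike.

```python
def variance_swap_payoff(log_returns, strike_var, notional, periods_per_year=252):
    """Payoff of a variance swap: notional * (realized variance - strike),
    with realized variance annualized from squared log returns."""
    realized_var = periods_per_year * sum(r * r for r in log_returns) / len(log_returns)
    return notional * (realized_var - strike_var), realized_var

# toy example: a 1% daily log return on each of 20 days
returns = [0.01] * 20
payoff, realized = variance_swap_payoff(returns, strike_var=0.02, notional=1000.0)
print(f"realized variance: {realized:.4f}")  # 252 * 0.0001 = 0.0252
print(f"payoff: {payoff:.2f}")               # 1000 * (0.0252 - 0.02) = 5.20
```

Because the payoff is linear in realized variance (not in volatility), the buyer gains when markets turn out more turbulent than the strike implied.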
7. Learning machines supporting bankruptcy prediction
Abstract
This work presents one of the more recent and efficient learning systems – support vector machines (SVMs). SVMs are mainly used for classification in specialized applications such as object recognition (Schölkopf, 1997), optical character recognition (Vapnik, 1995), electric load prediction (Eunite, 2001), management fraud detection (Rätsch and Müller, 2004), and early medical diagnostics. They are also used to predict the solvency or insolvency of companies or banks, which is the focus of this work. In other words, SVMs are capable of extracting useful information from financial data and then labelling companies by assigning them score values. Furthermore, probability of default (PD) values for companies can be calculated from these score values. The method is explained later.
Wolfgang Karl Härdle, Linda Hoffmann, Rouslan Moro
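The scoring idea in the abstract can be sketched with a minimal linear SVM trained by Pegasos-style stochastic sub-gradient descent on the hinge loss (this is a generic SVM sketch on invented toy data, not the chapter's method or data set; no bias term is used because the toy data are centred at the origin):

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=100, seed=0):
    """Pegasos-style SGD on the regularized hinge loss; returns the weight vector."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        idx = list(range(len(X)))
        rng.shuffle(idx)
        for i in idx:
            t += 1
            eta = 1.0 / (lam * t)
            score = sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [wj * (1.0 - eta * lam) for wj in w]  # shrink (regularization step)
            if y[i] * score < 1.0:                    # hinge-loss margin violation
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def svm_score(w, x):
    """Raw SVM output; its sign is the class, its size a solvency-like score."""
    return sum(wj * xj for wj, xj in zip(w, x))

# hypothetical "solvent" (+1) vs "insolvent" (-1) firms, two balance-sheet ratios each
X = [(2.0, 2.5), (2.5, 2.0), (3.0, 3.0), (-2.0, -2.5), (-2.5, -2.0), (-3.0, -3.0)]
y = [1, 1, 1, -1, -1, -1]
w = train_linear_svm(X, y)
predictions = [1 if svm_score(w, x) > 0 else -1 for x in X]
print(predictions)
```

The raw score plays the role of the company score values the abstract mentions; mapping scores to probabilities of default would need an additional calibration step.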
8. Distance matrix method for network structure analysis
Abstract
The distance matrix method is in line with network analyses of market structure such as clustering analysis (Focardi and Fabozzi, 2004), the geometry of crashes (Araujo and Louca, 2007), the degree distribution of nodes (Boss et al., 2004; Liu and He, 2009), etc. These methods allow one, among other things, to investigate the time evolution of correlations between time series such as stocks. The analysis of such correlations emerged from investigations of portfolio optimization. The standard approach is based on cross-correlation matrix analysis and the optimization of share proportions (see e.g. Adams et al., 2003; Cuthberson and Nitzsche, 2001). The basic question of what the most desirable proportions are among different shares in a portfolio led to the introduction of a distance between time series, and in particular of the ultrametric distance, which has become a classical method of correlation analysis between stocks (Bonanno et al., 2001; Mantegna and Stanley, 2000). The method allows one to analyze the structure of the market and therefore simplifies the choice of shares. In fact, this question about the structure of the stock market should be tackled before portfolio optimization.
Janusz Miśkiewicz
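The ultrametric distance referred to here (Mantegna and Stanley, 2000) maps a correlation coefficient to a distance via d = sqrt(2(1 - rho)); a small sketch on toy return series (not from the chapter):

```python
import math

def correlation(xs, ys):
    """Pearson correlation of two equally long series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def ultrametric_distance(xs, ys):
    """d = sqrt(2 * (1 - rho)): rho in [-1, 1] becomes a distance in [0, 2].
    max(0, .) guards against tiny negative values from floating-point error."""
    return math.sqrt(max(0.0, 2.0 * (1.0 - correlation(xs, ys))))

a = [0.01, -0.02, 0.015, 0.005, -0.01]
b = [2 * x for x in a]   # perfectly correlated with a
c = [-x for x in a]      # perfectly anti-correlated with a

print(f"d(a, b) = {ultrametric_distance(a, b):.3f}")  # 0.000
print(f"d(a, c) = {ultrametric_distance(a, c):.3f}")  # 2.000
```

Applying this pairwise to all stocks yields the distance matrix from which the market's network structure (e.g. a minimal spanning tree) is built.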

Insurance

Frontmatter
9. Building loss models
Abstract
A loss model, or actuarial risk model, is a parsimonious mathematical description of the behavior of a collection of risks constituting an insurance portfolio. It is not intended to replace sound actuarial judgment. In fact, according to Willmot (2001), a well-formulated model is consistent with and adds to intuition, but cannot and should not replace experience and insight. Moreover, a properly constructed loss model should reflect a balance between simplicity and conformity to the data, since an overly complex model may be too unwieldy to be useful.
Krzysztof Burnecki, Joanna Janczura, Rafał Weron
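A classic building block of loss models is the compound Poisson aggregate loss S = X_1 + ... + X_N with N ~ Poisson(lambda); a minimal simulation sketch (illustrative parameters, not the chapter's fitted models) verifies the identity E[S] = lambda * E[X]:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's algorithm for a Poisson(lam) draw."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def aggregate_loss(lam, mean_claim, rng):
    """One compound Poisson loss: Poisson claim count, exponential claim sizes."""
    n = poisson(lam, rng)
    # 1 - U lies in (0, 1], so the log is always defined
    return sum(-mean_claim * math.log(1.0 - rng.random()) for _ in range(n))

rng = random.Random(3)
lam, mean_claim = 5.0, 2.0  # 5 claims per period on average, mean claim size 2
sims = [aggregate_loss(lam, mean_claim, rng) for _ in range(20_000)]
mean_s = sum(sims) / len(sims)
print(f"simulated E[S]: {mean_s:.2f} (theory: lam * E[X] = {lam * mean_claim:.1f})")
```

Real loss models replace the exponential with a severity distribution fitted to claims data, but the compound structure stays the same.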
10. Ruin probability in finite time
Abstract
In examining the nature of the risk associated with a portfolio of business, it is often of interest to assess how the portfolio may be expected to perform over an extended period of time. One approach involves the use of ruin theory (Panjer and Willmot, 1992). Ruin theory is concerned with the excess of the income (with respect to a portfolio of business) over the outgo, or claims paid. This quantity, referred to as the insurer's surplus, varies in time. Specifically, ruin is said to occur if the insurer's surplus reaches a specified lower bound, e.g. minus the initial capital. One measure of risk is the probability of such an event, which clearly reflects the volatility inherent in the business. In addition, it can serve as a useful tool in long-range planning for the use of the insurer's funds.
Krzysztof Burnecki, Marek Teuerle
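The finite-time ruin probability for the classical risk process U(t) = u + c*t - S(t) can be estimated by Monte Carlo (a toy sketch with illustrative parameters, not the chapter's approximations); since the surplus only decreases at claim instants, it suffices to check it there:

```python
import math
import random

def ruin_probability(u, c, lam, mean_claim, horizon, n_sims, seed=11):
    """Monte Carlo finite-time ruin probability: Poisson(lam) claim arrivals,
    exponential claim sizes, premium income at rate c, initial capital u."""
    rng = random.Random(seed)
    ruins = 0
    for _ in range(n_sims):
        t, claims, ruined = 0.0, 0.0, False
        while True:
            t += -math.log(1.0 - rng.random()) / lam       # next claim arrival
            if t > horizon:
                break
            claims += -mean_claim * math.log(1.0 - rng.random())  # claim size
            if u + c * t - claims < 0.0:                   # surplus below zero
                ruined = True
                break
        ruins += ruined
    return ruins / n_sims

# premium rate with a 20% safety loading: c = (1 + 0.2) * lam * E[X]
lam, mean_claim = 1.0, 1.0
c = 1.2 * lam * mean_claim
psi_low = ruin_probability(0.0, c, lam, mean_claim, horizon=50.0, n_sims=5000)
psi_high = ruin_probability(10.0, c, lam, mean_claim, horizon=50.0, n_sims=5000)
print(f"psi(u=0):  {psi_low:.3f}")
print(f"psi(u=10): {psi_high:.3f}")
```

As expected, more initial capital sharply lowers the probability of ruin over the same horizon.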
11. Property and casualty insurance pricing with GLMs
Abstract
The purpose of insurance rate making is to ensure that the book of business generates enough revenue to pay claims and expenses, and to make a profit. Apart from managing the overall price level, the actuary also needs to design a rating segmentation structure. The insurance company must understand which policies are more, and which are less, likely to generate claims. The price of insurance should reflect these differences.
Jan Iwanik
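A log-link GLM yields a multiplicative rating structure: the premium is a base rate times a relativity per rating factor. The sketch below uses purely hypothetical relativities (not fitted values from the chapter) to show how such a structure prices two risk profiles:

```python
def glm_premium(base_rate, relativities, risk_profile):
    """Multiplicative rating implied by a log-link GLM:
    premium = base rate * product of relativities for the policy's factor levels."""
    premium = base_rate
    for factor, level in risk_profile.items():
        premium *= relativities[factor][level]
    return premium

# hypothetical relativities for two rating factors (illustration only)
relativities = {
    "driver_age": {"18-25": 1.8, "26-60": 1.0, "60+": 1.2},
    "region":     {"urban": 1.3, "rural": 0.9},
}
base_rate = 400.0

young_urban = glm_premium(base_rate, relativities,
                          {"driver_age": "18-25", "region": "urban"})
older_rural = glm_premium(base_rate, relativities,
                          {"driver_age": "26-60", "region": "rural"})
print(f"young urban driver: {young_urban:.2f}")  # 400 * 1.8 * 1.3 = 936.00
print(f"older rural driver: {older_rural:.2f}")  # 400 * 1.0 * 0.9 = 360.00
```

Fitting the relativities themselves (typically Poisson frequency and Gamma severity GLMs) is what the chapter covers.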
12. Pricing of catastrophe bonds
Abstract
Catastrophe (CAT) bonds are one of the more recent financial derivatives to be traded on the world markets. In the mid-1990s a market in catastrophe insurance risk emerged in order to facilitate the direct transfer of re-insurance risk associated with natural catastrophes from corporations, insurers and reinsurers to capital market investors. The primary instrument developed for this purpose was the CAT bond.
Krzysztof Burnecki, Grzegorz Kukla, David Taylor
13. Return distributions of equity-linked retirement plans
Abstract
In recent years, an increasing demand for capital-guaranteed equity-linked life insurance products and retirement plans has emerged. In Germany, a retirement plan called Riester-Rente is supported by the state with cash payments and tax benefits. These retirement plans have to preserve the invested capital. A company offering a Riester-Rente has to ensure that at the end of the saving period at least all cash inflows are available. Due to the investors' demand for high returns, banks and insurance companies are offering savings plans that invest not only in riskless bonds but also in products with a high equity proportion. For companies offering an equity-linked Riester-Rente, the guarantee to pay out at least the invested capital is a big challenge. Due to the long contract maturities of more than 30 years, it is not possible to just buy a protective put. Banks and insurance companies use many different concepts to generate this guarantee or to reduce the remaining risk for the company, varying from simple stop-loss strategies to complex dynamic hedging strategies. In our work we analyze the return distributions generated by some of these strategies.
Nils Detering, Andreas Weber, Uwe Wystup
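The simplest of the strategies mentioned, a stop-loss guarantee, can be sketched as follows (a toy simulation under assumed GBM dynamics and illustrative parameters, not the chapter's analysis): stay fully invested in equity and switch irrevocably into the riskless asset once the portfolio touches the discounted guarantee floor.

```python
import math
import random

def stop_loss_terminal_value(capital, guarantee, r, sigma, years,
                             steps_per_year=252, seed=5):
    """Stop-loss guarantee strategy: hold equity (GBM with drift r, vol sigma)
    until the value touches the floor G * exp(-r * (T - t)), then lock in
    by moving everything into the riskless asset."""
    rng = random.Random(seed)
    dt = 1.0 / steps_per_year
    n = int(years * steps_per_year)
    v = capital
    for i in range(n):
        t = i * dt
        floor = guarantee * math.exp(-r * (years - t))
        if v <= floor:
            # switched: the position grows at the riskless rate until maturity
            return v * math.exp(r * (years - t))
        v *= math.exp((r - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
    return v

terminal = stop_loss_terminal_value(capital=100.0, guarantee=100.0,
                                    r=0.03, sigma=0.2, years=30)
print(f"terminal value: {terminal:.2f} (guarantee: 100.00)")
```

Note the gap risk inherent in discrete monitoring: the value can jump slightly below the floor between checks, so the guarantee holds only approximately; this is one of the shortcomings that motivates the more complex dynamic hedging strategies the chapter compares.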
Backmatter
Metadata
Title
Statistical Tools for Finance and Insurance
edited by
Pavel Cizek
Wolfgang Karl Härdle
Rafał Weron
Copyright Year
2011
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-18062-0
Print ISBN
978-3-642-18061-3
DOI
https://doi.org/10.1007/978-3-642-18062-0