Browse Research

2005
For the classical credibility regression model, the dependence structure of the error terms is specialized to an autoregressive one. The corresponding credibility formula is given, and an appropriate method for estimating the structural parameters is described.
2005
In order to estimate the ultimate loss in long-tail business, there is an increasing trend to separate the claims development triangle into individual triangles for basic and large losses and to evaluate the chain-ladder projections for each of them separately.
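As a point of reference for the chain-ladder projections mentioned above, the following sketch applies the standard chain-ladder method to a single cumulative triangle; the triangle values are hypothetical, and the paper's split into basic and large losses is not reproduced here.

```python
import numpy as np

# Minimal chain-ladder projection for one cumulative claims triangle.
# The triangle values below are hypothetical; NaN marks future cells.
triangle = np.array([
    [100.0, 150.0, 170.0, 175.0],
    [110.0, 168.0, 190.0, np.nan],
    [120.0, 180.0, np.nan, np.nan],
    [130.0, np.nan, np.nan, np.nan],
])

n = triangle.shape[1]
factors = []
for j in range(n - 1):
    rows = ~np.isnan(triangle[:, j + 1])          # rows with both columns observed
    factors.append(triangle[rows, j + 1].sum() / triangle[rows, j].sum())

projected = triangle.copy()
for i in range(n):
    for j in range(n - 1):
        if np.isnan(projected[i, j + 1]):
            projected[i, j + 1] = projected[i, j] * factors[j]

ultimates = projected[:, -1]                       # chain-ladder ultimates per accident year
print(np.round(ultimates, 1))
```

In the separated approach described above, this projection would be run once on the basic-loss triangle and once on the large-loss triangle, and the resulting ultimates combined.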
2005
This work builds on the considerations and results in Merz (2005). Starting from the continuous-time analogue of the generalized Hachmeister model in Mangold (1987) and Merz (2004), it concentrates on four models that are derived as special cases.
2005
The subject of this work is the development of a credibility theory in continuous time. A short introduction is followed by the stochastic modelling of premium calculation in continuous time, and the concepts known from the discrete-time theory are transferred to the continuous-time setting.
2005
This paper asks whether the transfer of risk from banking to non-banking institutions, such as insurers, has reduced risk for the financial system as a whole or merely shifted it to less transparent sectors. If the latter is the case, then it may be that new forms of risk and vulnerability are being introduced into the global financial system. Keywords: credit risk transfer; insurance industry; financial stability
2005
This paper addresses the impact of the implementation of the new international financial reporting standards on insurance companies. Keywords: financial reporting standards; IFRS 4; IASB; phase I and II
2005
In a deregulated insurance market, insurance carriers have an incentive to be innovative in their pricing decisions by segmenting their portfolios and designing new bonus-malus systems (BMS). This paper examines the evolution of market shares and claim frequencies in a two-company market, when one insurer breaks off the existing stability by introducing a super-discount class in its BMS.
2005
Insurance companies are currently under considerable pressure from regulators, analysts and investors to change their approach to risk and capital management. As insurers consider how to implement new ways to measure and manage their business in response to these demands, they would do well to heed the lessons learned in the banking industry, which has been on a similar path for the last decade.
2005
We consider d identically and continuously distributed dependent risks X1,...,Xd. Our main result is a theorem on the asymptotic behaviour of expected shortfall for the aggregate risks: there is a constant c_d such that, for large u, E[X1 + ... + Xd | X1 + ... + Xd ≤ −u] ~ −u c_d. Moreover, we study diversification effects in two dimensions, similar to our Value-at-Risk studies.
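The asymptotic relation above can be examined numerically. The sketch below is a simple Monte Carlo illustration of the ratio E[S | S ≤ −u]/(−u) for an aggregate S of dependent, heavy-tailed risks; the Gaussian copula, the Student-t margins and all parameter values are assumptions made only for this illustration and are not taken from the paper.

```python
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(0)

# Monte Carlo look at E[S | S <= -u] ~ -u * c_d for S = X_1 + ... + X_d.
# Dependence: Gaussian copula with correlation rho; margins: Student-t(3).
d, n, rho = 3, 1_000_000, 0.5
cov = rho * np.ones((d, d)) + (1 - rho) * np.eye(d)
z = rng.multivariate_normal(np.zeros(d), cov, size=n)
x = t.ppf(norm.cdf(z), df=3)          # heavy-tailed dependent risks
s = x.sum(axis=1)

for u in (5, 10, 20, 40):
    tail = s[s <= -u]
    print(f"u={u:3d}  exceedances={tail.size:6d}  E[S | S <= -u] / (-u) = {tail.mean() / -u:5.2f}")
```

For large u the printed ratio should settle near a constant, which is the c_d appearing in the theorem for the chosen dependence structure and margins.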
2005
The development and management of data resources that support property/casualty actuarial work are very challenging undertakings, especially in a high-volume transactional processing environment. In order to equip actuaries with the data resources necessary to excel in the performance of their functions, an Actuarial Data Management (ADM) support team is needed.
2005
Spring 2005 Forum, including the Reinsurance Call Papers. Table of Contents: 2005 CAS Reinsurance Call Papers, beginning with "On Optimal Reinsurance Arrangement" by Yisheng Bu, Ph.D.
2005
This paper solves explicitly a simple equilibrium model with liquidity risk. In our liquidity-adjusted capital asset pricing model, a security's required return depends on its expected liquidity as well as on the covariances of its own return and liquidity with the market return and liquidity. In addition, a persistent negative shock to a security's liquidity results in low contemporaneous returns and high predicted future returns.
2005
Pricing credit-equity hybrids is a challenging task as the established pricing methodologies for equity options and credit derivatives are quite different.
2005
We consider d identically and continuously distributed dependent risks X1, ..., Xd. Our main result is a theorem on the asymptotic behaviour of expected shortfall for the aggregate risks. Moreover, we study diversification effects in two dimensions, similar to our Value-at-Risk studies in [2].
2005
Outsourcing information technology (IT) operations has been recognized to have important potential benefits, including cost reduction, improved quality of service, and access to technological expertise. Researchers and practitioners also recognize that, in some circumstances, IT outsourcing entails risk, and that it sometimes leads to undesirable consequences that are the opposite of the expected benefits.
2005
We develop a methodology for the optimal design of financial instruments aimed at hedging forms of risk that are not traded on financial markets. The idea is to minimize the risk of the issuer under the constraint imposed by a buyer, who enters the transaction if and only if her risk level remains below a given threshold.
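A heavily simplified version of this design problem can be written as a small constrained optimization. In the sketch below, variance stands in for the risk measure and all data are simulated; both choices, as well as the factor structure, are assumptions made for illustration and are not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Simplified stand-in for the design problem: choose payoff weights w on
# tradable factors F to minimise the issuer's residual risk, subject to the
# buyer's risk staying below a threshold. Variance is used as the risk
# measure and all data are simulated (assumptions for illustration only).
rng = np.random.default_rng(4)
n, k = 5_000, 3
F = rng.normal(size=(n, k))                                          # tradable factor payoffs
L = 2.0 * F[:, 0] - 1.0 * F[:, 1] + rng.normal(scale=0.5, size=n)    # non-traded exposure
threshold = 1.0                                                      # buyer's risk limit

def issuer_risk(w):
    return np.var(L - F @ w)          # issuer keeps L and sells the payoff F @ w

def buyer_risk(w):
    return np.var(F @ w)              # buyer holds the instrument's payoff

res = minimize(issuer_risk, x0=np.zeros(k),
               constraints=[{"type": "ineq", "fun": lambda w: threshold - buyer_risk(w)}])
print("optimal weights:", np.round(res.x, 3),
      " issuer residual variance:", round(res.fun, 3))
```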
2005
Most practitioners favour a one-factor model (CAPM) when estimating expected return for an individual stock. For estimation of portfolio returns, academics recommend the Fama and French three-factor model. The main objective of this paper is to compare the performance of these two models for individual stocks. First, estimates for individual stock returns based on CAPM are obtained using different time frames, data frequencies, and indexes.
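For reference, both models are usually estimated by ordinary least squares on excess returns. The sketch below fits CAPM and the Fama-French three-factor model to one synthetic return series; the factor data and column names are placeholders rather than the paper's data set.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative OLS estimation of CAPM and the Fama-French three-factor model
# for a single stock. The DataFrame below is synthetic; in practice the
# columns Mkt-RF, SMB, HML and RF would come from a factor data source.
rng = np.random.default_rng(1)
n = 120                                   # e.g. ten years of monthly data
factors = pd.DataFrame({
    "Mkt-RF": rng.normal(0.6, 4.5, n),
    "SMB": rng.normal(0.2, 3.0, n),
    "HML": rng.normal(0.3, 3.0, n),
    "RF": np.full(n, 0.3),
})
stock_ret = 0.3 + 1.1 * factors["Mkt-RF"] + 0.4 * factors["SMB"] + rng.normal(0, 3.0, n)
excess = stock_ret - factors["RF"]

capm = sm.OLS(excess, sm.add_constant(factors[["Mkt-RF"]])).fit()
ff3 = sm.OLS(excess, sm.add_constant(factors[["Mkt-RF", "SMB", "HML"]])).fit()

print("CAPM beta:", round(capm.params["Mkt-RF"], 3), " R^2:", round(capm.rsquared, 3))
print("FF3 betas:", ff3.params.round(3).to_dict(), " R^2:", round(ff3.rsquared, 3))
```

Comparing the two fits over different time frames, data frequencies and market indexes is the kind of exercise the paper describes for individual stocks.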
2005
This paper develops a likelihood-based methodology to estimate loss distributions and compute Capital at Risk in risk management applications. In particular, we deal with the problem of estimating severity distributions with censored and truncated operational losses, for which numerical maximization of the likelihood function by means of standard optimization tools may be difficult.
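A common special case of this estimation problem is a severity distribution fitted to losses recorded only above a reporting threshold. The sketch below maximizes a left-truncated lognormal likelihood; the lognormal family, the threshold and the simulated data are assumptions for illustration, not the paper's specification.

```python
import numpy as np
from scipy import stats, optimize

# Illustrative maximum-likelihood fit of a lognormal severity distribution to
# operational losses observed only above a reporting threshold H (left truncation).
rng = np.random.default_rng(2)
H = 10_000.0
full = rng.lognormal(mean=9.0, sigma=1.5, size=20_000)
losses = full[full >= H]                      # only losses above H are recorded

def neg_loglik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    # Truncated density: f(x) / (1 - F(H)) for x >= H
    return -(dist.logpdf(losses).sum() - losses.size * np.log(dist.sf(H)))

res = optimize.minimize(neg_loglik, x0=[np.log(np.median(losses)), 1.0],
                        method="Nelder-Mead")
print("fitted (mu, sigma):", np.round(res.x, 3))
```

Ignoring the truncation term in the likelihood is what typically biases severity estimates and hence the resulting Capital at Risk figures.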
2005
Klaus Böcker and Claudia Klüppelberg investigate a simple loss distribution model for operational risk. They show that, when loss data are heavy-tailed (which in practice they are), a simple closed-form approximation for the OpVaR can be obtained. They apply this approximation in particular to the Pareto severity model, for which they also obtain a simple time-scaling rule for the operational VaR.
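A closed-form approximation of this kind is commonly written as VaR_t(κ) ≈ F^{-1}(1 − (1 − κ)/(λt)) for a compound Poisson model with severity distribution F and frequency λ per period. The sketch below evaluates this type of approximation for a Pareto severity tail; the formula's use here and all parameter values are illustrative assumptions rather than a reproduction of the paper.

```python
import math

def opvar_pareto(kappa, lam, t, theta, alpha):
    """Closed-form OpVaR approximation VaR_t(kappa) ~ F^{-1}(1 - (1 - kappa) / (lam * t))
    for a compound Poisson model with Pareto severity tail (1 + x/theta)**(-alpha)."""
    return theta * (((lam * t) / (1.0 - kappa)) ** (1.0 / alpha) - 1.0)

# Parameter values below are purely illustrative.
kappa, lam, theta, alpha = 0.999, 25.0, 50_000.0, 1.2
for t in (1, 2, 5):
    print(f"t={t}:  OpVaR ~ {opvar_pareto(kappa, lam, t, theta, alpha):,.0f}")

# For a Pareto severity the approximation grows roughly like t**(1/alpha) in the
# horizon, which is the kind of simple time-scaling rule referred to above.
```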
2005
This paper compares a number of different extreme value models for determining the value at risk (VaR) of three LIFFE futures contracts. A semi-nonparametric approach is also proposed, where the tail events are modeled using the generalised Pareto distribution, and normal market conditions are captured by the empirical distribution function.
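For the tail component of such a semi-nonparametric approach, the standard peaks-over-threshold construction fits a generalised Pareto distribution to exceedances over a high threshold and inverts it for extreme quantiles, while the empirical distribution covers the body. The sketch below illustrates this on simulated data; the threshold choice and the return series are assumptions, not the LIFFE futures data used in the paper.

```python
import numpy as np
from scipy.stats import genpareto

# Illustrative peaks-over-threshold VaR: fit a GPD to losses above a high
# threshold and combine it with the empirical distribution for the body.
rng = np.random.default_rng(3)
losses = rng.standard_t(df=4, size=10_000)          # stand-in for futures losses

u = np.quantile(losses, 0.95)                       # tail threshold (95th percentile)
exceed = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0.0)       # shape xi, scale beta, location fixed at 0

def var_pot(p, losses, u, xi, beta):
    n, n_u = losses.size, (losses > u).sum()
    # Standard POT quantile formula for confidence levels p beyond the threshold level
    return u + (beta / xi) * ((n / n_u * (1.0 - p)) ** (-xi) - 1.0)

for p in (0.99, 0.995, 0.999):
    print(f"VaR_{p}: {var_pot(p, losses, u, xi, beta):.3f}")
```

Quantiles below the threshold level would simply be read off the empirical distribution function, which is what makes the approach semi-nonparametric.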