Browse Research

2010
With the release of Excel 2007 and SQL Server 2008, Microsoft has provided actuaries with a powerful and easy-to-use predictive modeling platform. This paper provides a brief overview of the SQL Server system. It then discusses the “Data Mining Client for Excel 2007” and explains how actuaries can use Excel to build predictive models, with little or no knowledge of the underlying SQL Server system.
2010
Response data (loss cost, claim frequency or claim severity) are often pre-adjusted with known factors and directly analyzed with generalized linear models (GLM). This paper shows that the exposure weights should also be adjusted if the Tweedie distribution with log link is used in such direct analysis.
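The weight adjustment can be motivated by a short variance calculation (a sketch of the standard Tweedie argument, not necessarily the paper's own notation). For a Tweedie response with power parameter $p$, $\mathrm{Var}(Y_i) = \phi\,\mu_i^{p}/w_i$. If the response is pre-adjusted by a known factor $k_i$, so that $Y_i' = Y_i/k_i$ and $\mu_i' = \mu_i/k_i$, then

$$\mathrm{Var}(Y_i') = \frac{\mathrm{Var}(Y_i)}{k_i^{2}} = \frac{\phi\,\mu_i^{p}}{w_i k_i^{2}} = \frac{\phi\,(\mu_i')^{p}}{w_i k_i^{\,2-p}},$$

so the adjusted response only remains a Tweedie GLM with the same dispersion if the exposure weight is rescaled from $w_i$ to $w_i k_i^{\,2-p}$.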
2010
This bibliography has been compiled from publications which are available to the author and from references given in these publications. Some of the references are incomplete and others may be missing. References which are available to the author are indicated by the symbol ² preceding the title.
2010
This text outlines basic property/casualty insurance ratemaking concepts and techniques. It is intended to be a single educational text to prepare actuarial candidates practicing around the world for basic ratemaking. A key concept in the text is the fundamental insurance equation, which balances the expected future income and outgo of an insurance operation.
2010
Underinsured Motorist (UIM) coverage, also known as Family Protection coverage, is a component of most Canadian personal automobile policies, with similar coverage existing in many American states. Traditional ratemaking methods are not appropriate for UIM due to poor credibility of available data as well as the unique characteristics of the UIM coverage.
2010
This paper proposes a methodology to calculate the credibility risk premium based on the uncertainty of the risk premium (also known as the pure loss cost or pure premium), as estimated by the standard deviation of the risk premium estimator. An optimal estimator based on the uncertainties involved in the pricing process is constructed.
2010
Property/casualty reserves are estimates of losses and loss development and as such will not match the ultimate results. Sources of error include model error (the methodology used does not accurately reflect the development process), parameter error (incorrect model parameters), and process error (future development is random). This paper provides a comprehensive and practical methodology for quantifying risk that includes all three sources.
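For orientation (standard reserving notation, not formulas quoted from the paper), the first two error sources are usually combined in the mean squared error of prediction of a reserve estimate $\hat R$ for the true outcome $R$:

$$\mathrm{MSEP}(\hat R) = \mathbb{E}\big[(R-\hat R)^2\big] \approx \underbrace{\mathrm{Var}(R)}_{\text{process error}} + \underbrace{\mathrm{Var}(\hat R)}_{\text{parameter error}},$$

with model error lying outside this decomposition; quantifying all three sources together is the contribution described above.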
2010
This paper offers a methodology for calculating optimal bounds on tail risk probabilities by deriving upper and lower semiparametric bounds, given only the first two moments of the distribution. We apply this methodology to determine bounds for the probabilities of two tail events. The first tail event occurs when two financial variables simultaneously have extremely low values.
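As a univariate point of reference (the paper's bounds are for joint, bivariate tail events), the one-sided Chebyshev (Cantelli) inequality already gives a semiparametric tail bound from two moments alone: if $\mathbb{E}[X]=\mu$ and $\mathrm{Var}(X)=\sigma^{2}$, then for any $k>0$

$$\Pr(X \le \mu - k\sigma) \le \frac{1}{1+k^{2}}.$$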
2010
In this paper, linear mixed models are employed for estimation of structural parameters in credibility context. In particular, Hachemeister’s model and Dannenburg’s crossed classification model are considered. Maximum likelihood (ML) and restricted maximum likelihood (REML) methods are developed to estimate the variance and covariance parameters.
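A minimal illustration of REML estimation of variance components in a credibility-style random-effects model, using Python's statsmodels rather than anything from the paper; the data layout (columns risk, period, loss) and the random-slope structure are assumptions made for the example.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per risk and observation period; columns: risk, period, loss
    df = pd.read_csv("experience.csv")  # hypothetical data set

    # Hachemeister-style regression credibility: fixed time trend plus a
    # random intercept and slope per risk; variance components by REML.
    model = smf.mixedlm("loss ~ period", df, groups=df["risk"], re_formula="~period")
    fit = model.fit(reml=True)

    print(fit.summary())   # fixed effects and variance components
    print(fit.cov_re)      # estimated between-risk covariance matrix

Setting reml=False in the same call gives the maximum likelihood estimates for comparison.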
2010
In 2009, in the aftermath of the Global Financial Crisis, 140 American banks failed, and hundreds of other banks were classified as “problem institutions” by the FDIC. This has led to numerous books and articles examining the causes of systemic risk in our financial system. In this paper we step back in history to see what we should have learned from a previous banking crisis, which occurred during the 1980s.
2010
This paper presents a bootstrap approach to estimate the prediction distributions of reserves produced by the Munich chain ladder (MCL) model. The MCL model was introduced by Quarg and Mack (2004) and takes into account both paid and incurred claims information. In order to produce bootstrap distributions, this paper addresses the application of bootstrapping methods to dependent data, with the consequence that correlations are considered.
2010
There is a growing body of literature on robust statistical procedures, which have been applied to loss severity fitting in actuarial applications. An introduction to robust methods for loss reserving is presented in this paper. In particular, following Tampubolon (2008), reserving models for a development triangle are compared based on the sensitivity of the reserve estimates to changes in individual data points.
2010
In this paper we construct a stochastic model and derive approximation formulae to estimate the standard error of prediction under the loss ratio approach of assessing premium liabilities. We focus on the future claims component of premium liabilities and examine the weighted and simple average loss ratio estimators.
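For concreteness (standard definitions rather than notation quoted from the paper), with losses $L_i$ and earned premiums $P_i$ for $n$ past periods, the two estimators are

$$\widehat{LR}_{\text{weighted}} = \frac{\sum_{i=1}^{n} L_i}{\sum_{i=1}^{n} P_i}, \qquad \widehat{LR}_{\text{simple}} = \frac{1}{n}\sum_{i=1}^{n} \frac{L_i}{P_i},$$

and under the loss ratio approach the future claims component is obtained by applying the chosen estimator to the unearned premium.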
2010
The behavior of competing insurance companies investigating insurance fraud follows one of several Nash Equilibria under which companies consider the claim savings, net of investigation cost, on a portion, or all, of the total claim. This behavior can reduce the effectiveness of investigations when two or more competing insurers are involved.
2010
Insurers purchase catastrophe reinsurance primarily to reduce underwriting risk in any one experience period and thus enhance the stability of their income stream over time. Reinsurance comes at a cost and therefore it is important to maintain a balance between the perceived benefit of buying catastrophe reinsurance and its cost.
2010
Motivation: Advanced calculations on large data sets provide important business insights. Such calculations must be flexible enough for the dynamic nature of advanced analytics done by actuaries and other high-skill users, yet must also leverage the power and stability of large-scale IT systems.
2010
Copulas are an elegant mathematical tool for decoupling a joint distribution into the marginal component and the dependence structure component; thus enabling us to model simultaneous events with a greater degree of flexibility. However, as with many statistical techniques, the application of copulas in practice is as much art as it is science.
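The decoupling referred to is Sklar's theorem: any joint distribution $H$ with marginals $F$ and $G$ can be written as

$$H(x,y) = C\big(F(x),\,G(y)\big)$$

for some copula $C$, so the marginal behavior and the dependence structure can be specified, and fitted, separately.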
2010
The rise and fall of subprime mortgage securitizations contributed in part to the ensuing credit crisis and financial crisis of 2008. Some participants in the subprime-mortgage-backed securities market relied at least in part on analyses grounded in the loss development factor (LDF) method, and many did not conduct their own credit analyses, relying instead on the work of others such as securities brokers and rating agencies.
2010
In his book, The Best Way to Rob a Bank is to Own One, William Black describes in detail the complex collusion between bankers, regulators, and legislators that brought about the Savings and Loan crisis of the 1980s and early 1990s. As part of the scheme, leverage was used to purchase bankrupt companies that became the basis for a Ponzi-like speculative bubble that ultimately collapsed.
2010
This paper argues that no single valuation basis is completely reliable: neither market price nor other alternatives can accurately measure value. Therefore, this paper proposes that a preferable solution is to simultaneously record two bases of valuation: market price and appraisal value.
2010
This article starts with primitive assumptions on preferences and risk. It then derives prices consistent with a social optimum within an insurance company and the consumer-level capital allocation implied therein. The allocation “adds up” to the total capital of the firm (a result echoing findings in the congestion pricing literature—where optimal tolls exactly cover the rental cost of the highway).
2010
This work deals with prediction of the IBNR reserve under a different ordering of the data in the non-cumulative run-off triangle. The rows of the triangle are stacked, resulting in a univariate time series with several missing values.
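A toy sketch of the stacking step (an assumed layout, not the authors' code): reading an incremental triangle row by row produces one univariate series in which the unobserved lower-right cells appear as missing values to be predicted.

    import numpy as np

    # 4 x 4 incremental run-off triangle; np.nan marks unobserved future cells
    triangle = np.array([
        [120.0,  60.0,   25.0,   10.0],
        [130.0,  70.0,   30.0, np.nan],
        [125.0,  65.0, np.nan, np.nan],
        [140.0, np.nan, np.nan, np.nan],
    ])

    # Stack the rows into a single series; the NaNs are the cells to predict
    series = triangle.reshape(-1)
    print(series)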
2010
We study the solvency of insurers in a practical model where, in addition to basic insurance claims and premiums, economic factors such as inflation, real growth, and investment returns affect the capital development of the companies. The objective is to give qualitative descriptions of risks by means of crude estimates for finite-time ruin probabilities. In our setup, the economic factors have a dominant role in the estimates.
2010
A model is proposed using the run-off triangle of paid claims and also the numbers of reported claims (in a similar triangular array). These data are usually available, and allow the model proposed to be implemented in a large variety of situations. On the basis of these data, the stochastic model is built from detailed assumptions for individual claims, but then approximated using a compound Poisson framework.
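As background to the approximation step (standard compound Poisson facts, not formulas taken from the paper): if the claim count $N$ is Poisson with mean $\lambda$ and the claim sizes $X_k$ are i.i.d. and independent of $N$, the aggregate claim $S=\sum_{k=1}^{N}X_k$ has

$$\mathbb{E}[S]=\lambda\,\mathbb{E}[X], \qquad \mathrm{Var}(S)=\lambda\,\mathbb{E}[X^{2}].$$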
2010
The separation method was introduced by Verbeek (1972) in order to forecast numbers of excess claims and it was developed further by Taylor (1977) to be applicable to the average claim cost.
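In outline (a standard presentation of the separation method; the notation is assumed rather than quoted), the incremental quantity $s_{ij}$ for origin period $i$ and development period $j$, normalized by an exposure measure $n_i$, is modeled as a development effect times a calendar-period effect:

$$\frac{s_{ij}}{n_i} = r_j\,\lambda_{i+j}, \qquad \sum_{j} r_j = 1,$$

with Verbeek's version applied to excess claim counts and Taylor's to the average claim cost, the $\lambda$'s absorbing calendar-year (for example, inflationary) effects.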