Browse Research

2007
The purpose of this study is to compare the results of several risk allocation methods for a realistic insurance company example. The basis for the study is the fitted loss distributions of Bohra and Weist (2001), which were derived from the hypothetical data for DFA Insurance Company (DFAIC).
2007
Extended service contracts and their programs continue to evolve and expand to cover more and more products. This paper is intended to be a basic primer for the actuary or risk professional interested in either working in or understanding this area. We discuss the general structure of service contract programs and highlight features that should be considered in the review of the financial solidity of such programs.
2007
This paper summarizes key results from the Report of the Casualty Actuarial Society (CAS) Research Working Party on Risk Transfer Testing. The Working Party defined and described a structured process of elimination to narrow down the field of reinsurance contracts that have to be tested for risk transfer.
2007
This paper applies a bivariate lognormal distribution to price a property policy with property damage and business interruption cover subject to an attachment point, separate deductibles, and a combined limit. Curve-fitting tasks for univariate probability distributions are compared with the tasks required for multivariate probability distributions. This is followed by a brief discussion of the data used, data-related issues, and adjustments.
2007
This paper demonstrates a Bayesian method for estimating the distribution of future loss payments of individual insurers.
2007
Over the past twenty years, many actuaries have argued that the chain-ladder method of loss reserving is biased; nonetheless, the chain-ladder method remains the favorite tool of reserving actuaries. Nearly everyone who acknowledges this bias believes it to be upward. While supporting these claims, the author proposes to address two deeper issues.
2007
This paper presents a framework for stochastically modeling the path of the ultimate loss ratio estimate through time from the inception of exposure to the payment of all claims. The framework is illustrated using Hayne’s lognormal loss development model, but the approach can be used with other stochastic loss development models.
2007
The tax shields from debt financing reduce the cost of operations for firms with low cost of bankruptcy. State regulation prevents insurers from using long-term debt as statutory surplus, to ensure sufficient equity capital to meet policyholder obligations. Constraints on regulatory capital force policyholders to fund high tax costs on insurers and reduce the market forces that support solvency.
2007
While accounting principles and actuarial standards of practice are all well designed, they provide only broad guidance to the actuary on what is “reasonable.” This broad guidance is based on the principle that “reasonable” assumptions and methods lead to “reasonable” estimates.
2007
This paper applies the exponential dispersion family with its associated conjugates to the claims reserving problem. This leads to a formula for the claims reserves that is equivalent to applying credibility weights to the chain-ladder reserves and Bornhuetter-Ferguson reserves.
2007
The purpose of this study note is to educate actuaries on certain basic insurance accounting topics that may be omitted in other syllabus readings. These topics include:
• Loss and loss adjustment expense accounting basics
• Reinsurance accounting basics
• Examples of how ceded reinsurance impacts an insurer's financial statements
• Deposit accounting basics
2007
Written from a global perspective on risk, hazards, and disasters, Introduction to International Disaster Management provides practitioners, educators and students with a comprehensive overview of the players, processes and special issues involved in the management of large-scale natural and technological disasters.
2007
The most important new development in the past two decades in the personal lines of insurance may well be the use of an individual's credit history as a classification and rating variable to predict losses.
2007
This paper shows how expert opinion can be inserted into a stochastic framework for loss reserving. The reserving methods used are the chain-ladder and Bornhuetter-Ferguson, and the stochastic framework follows England and Verrall [8]. Although stochastic models have been studied, there are two main obstacles to their more frequent use in practice: ease of implementation and adaptability to user needs.
2007
This paper discusses an approach to the correlation problem in which losses from different lines of insurance are linked by a common variation (or shock) in the parameters of each line’s loss model. The paper begins with a simple common shock model and graphically illustrates the effect of the magnitude of the shocks on correlation.
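The common-shock idea described in this abstract can be sketched in a few lines. Everything below is an illustrative assumption rather than the paper's actual model: two lines with Poisson claim counts whose means are both scaled by the same gamma-distributed shock (with mean 1), which induces positive correlation between the lines.

```python
import math
import random

random.seed(1)

def poisson(mean):
    # Knuth's method for drawing a Poisson variate (adequate for
    # moderate means; the product of uniforms falls below exp(-mean)).
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate(n_sims, base_means=(50.0, 80.0), shock_shape=10.0):
    # Hypothetical common-shock form: both lines' Poisson means are
    # scaled by the same gamma shock (mean 1.0), so large shocks push
    # both lines' claim counts up together.
    pairs = []
    for _ in range(n_sims):
        shock = random.gammavariate(shock_shape, 1.0 / shock_shape)
        pairs.append(tuple(poisson(m * shock) for m in base_means))
    return pairs

def corr(xs, ys):
    # Pearson correlation, computed directly for portability.
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / len(xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / len(ys))
    return cov / (sx * sy)

xs, ys = zip(*simulate(2000))
shared_shock_corr = corr(xs, ys)  # clearly positive, driven by the shock
```

Shrinking `shock_shape` makes the shock more volatile and pushes the correlation higher, which mirrors the paper's graphical point about shock magnitude driving correlation.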
2007
Although the copula literature has many instances of bivariate copulas, once more than two variates are correlated, the choice of copulas often comes down to selecting the degrees-of-freedom parameter in the t-copula. In search of a wider selection of multivariate copulas, we review a generalization of the t-copula and some copulas defined by Harry Joe. Generalizing the t-copula gives more flexibility in setting tail behavior.
2007
In applications of the collective risk model, significantly more attention is often given to modelling severity than to modelling frequency. Sometimes frequency modelling is neglected to the extent of simply using a Poisson distribution for the number of claims. The Poisson distribution has variance equal to its mean, and there are multiple reasons why this is almost never appropriate when forecasting numbers of non-life insurance claims.
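The variance-equals-mean restriction can be made concrete with a small sketch. The negative binomial comparison below is a common alternative frequency model, chosen here for illustration; the parameter values are assumptions, not taken from the paper:

```python
# Poisson with mean lam: the variance is forced to equal lam, so the
# model cannot represent overdispersed claim counts.
lam = 100.0
poisson_mean, poisson_var = lam, lam

# Negative binomial with parameters r and p:
#   mean = r * (1 - p) / p,  variance = r * (1 - p) / p**2
# Same mean as the Poisson above, but with extra variance.
r, p = 100.0, 0.5
nb_mean = r * (1 - p) / p        # 100.0
nb_var = r * (1 - p) / p ** 2    # 200.0

assert poisson_var == poisson_mean
assert nb_var > nb_mean  # overdispersion the Poisson cannot capture
```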
2007
Data, and data quality, are an ever more critical part of our lives today, both personally and corporately, yet data is not a subject that many actuaries focus on. Every business operation creates or consumes huge quantities of data. Data Quality: The Field Guide, by Thomas C. Redman, Ph.D., provides many practical approaches to establishing or improving data quality programs in businesses.
2007
In this paper we explore the bias in the estimation of the Value at Risk and Conditional Tail Expectation risk measures using Monte Carlo simulation. We assess the use of bootstrap techniques to correct the bias for a number of different examples.
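A rough sketch of the bootstrap idea follows; this is not the paper's own code, and the lognormal loss model, sample size, and resample count are all illustrative assumptions. The same resampling pattern applies to the Conditional Tail Expectation.

```python
import random
import statistics

random.seed(0)

def var_estimate(sample, level=0.95):
    """Empirical Value at Risk: the level-quantile of the sampled losses."""
    ordered = sorted(sample)
    return ordered[int(level * len(ordered)) - 1]

# Monte Carlo losses from an assumed lognormal severity.
losses = [random.lognormvariate(0.0, 1.0) for _ in range(500)]
raw_var = var_estimate(losses)

# Bootstrap bias correction: resample with replacement, re-estimate VaR
# on each resample, take the mean shift as the bias estimate, and
# subtract it from the raw estimate.
boot = [var_estimate(random.choices(losses, k=len(losses)))
        for _ in range(200)]
bias = statistics.mean(boot) - raw_var
corrected_var = raw_var - bias
```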
2007
Motivation: Capital allocation can have substantial ramifications for measuring risk-adjusted profitability as well as for setting risk loads in pricing.
2007
Enterprise risk management (ERM) is the process of analyzing the portfolio of risks facing the enterprise to ensure that the combined effect of such risks is within an acceptable tolerance. While more firms are adopting ERM, little academic research exists about the costs and benefits of ERM.
2007
Simultaneous modelling of operational risks occurring in different event-type/business-line cells poses a challenge for operational risk quantification. Invoking the new concept of Lévy copulas for dependence modelling yields simple, high-quality approximations of multivariate operational VaR.
2007
Decisions taken in order to achieve planned targets are always connected with risk that influences the resources of the company, in both positive and negative ways. This risk should be considered from a capital perspective.
2007
Value-at-risk (VaR) is a widely used risk measure among financial institutions. Cash-flow-at-risk (CFaR) is an attempt to transfer the same ideas to the setting of a non-financial firm.