Browse Research

This paper studies an insurance model under the regulation that the insurance company has to reserve sufficient initial capital to ensure that the ruin probability does not exceed a given quantity α. We prove the existence of the minimum initial capital. To illustrate our results, we give an example of approximating the minimum initial capital for exponential claims.
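As a minimal sketch of the exponential-claims example, the ruin probability of the classical Cramér-Lundberg surplus process with exponential severities has a closed form that can be inverted for the smallest capital meeting a target α; the surplus model, parameter values, and function name below are illustrative assumptions, not the paper's construction.

```python
import math

def min_initial_capital(alpha, mean_claim, loading):
    """Smallest u with psi(u) <= alpha in the classical Cramer-Lundberg model
    with exponential claims, where
        psi(u) = exp(-R*u) / (1 + loading),  R = loading / ((1 + loading) * mean_claim).
    This closed form is specific to exponential severities."""
    R = loading / ((1.0 + loading) * mean_claim)
    if alpha >= 1.0 / (1.0 + loading):     # psi(0) already meets the target: no capital needed
        return 0.0
    return -math.log(alpha * (1.0 + loading)) / R

# Example: mean claim 1.0, 20% premium loading, target ruin probability 1%.
print(round(min_initial_capital(0.01, 1.0, 0.20), 3))
```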
The current industry standard approach evaluates reinsurance effectiveness by calculating capital cost savings as the product of a fixed capital cost rate and the required capital which is released. Reinsurance is deemed value-creating if the resulting capital cost savings exceed the profit margin ceded to support the purchase, a Return On Risk-Adjusted Capital (RORAC) approach.
In their short paper, the authors describe an elegant decision rule for evaluating the attractiveness of potential reinsurance transactions. In effect, they propose comparing the premium quoted by reinsurers for a particular reinsurance structure to the portion of its premiums the ceding company would need to allocate, given its cost of capital, to retain the risk.
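The decision rule described in the two abstracts above reduces to a one-line comparison; the sketch below, with purely illustrative figures and a hypothetical function name, shows the RORAC-style screen under the assumption that capital cost savings equal a fixed cost-of-capital rate times the capital released.

```python
def reinsurance_adds_value(capital_cost_rate, capital_released, ceded_margin):
    """RORAC-style screen: the deal is value-creating when the capital cost
    saved by releasing required capital exceeds the margin ceded to the reinsurer."""
    capital_cost_savings = capital_cost_rate * capital_released
    return capital_cost_savings > ceded_margin

# Illustrative figures (not from the paper): releasing 50m of required capital
# at a 10% cost of capital saves 5m, so ceding up to 5m of margin breaks even.
print(reinsurance_adds_value(0.10, 50e6, 4e6))   # True
print(reinsurance_adds_value(0.10, 50e6, 6e6))   # False
```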
This paper presents a Bayesian technique for adjusting a mixed exponential severity distribution in response to partially-credible observed claim severities. It presents two applications: pricing excess of loss (XOL) reinsurance layers and computing increased limits factors (ILFs). The paper’s Bayesian model uses a Dirichlet distribution over the mixed exponential’s initial mixture weights.
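A hedged sketch of the idea: with a Dirichlet prior on the mixture weights, observed severities shift weight toward the exponential components that best explain them. The single responsibility-weighted update below approximates conjugate updating (latent component memberships are replaced by their expected counts), and all parameter values are invented for illustration; the paper's exact machinery may differ.

```python
import numpy as np

# Prior mixture: weights over exponential components with means theta_k (assumed values).
prior_alpha = np.array([6.0, 3.0, 1.0])            # Dirichlet concentration parameters
theta       = np.array([5e3, 25e3, 100e3])         # exponential component means
w_prior     = prior_alpha / prior_alpha.sum()

claims = np.array([2e3, 4e3, 18e3, 60e3, 250e3])   # observed severities (illustrative)

# "Responsibility" of each component for each claim under the prior weights.
dens = (1.0 / theta) * np.exp(-claims[:, None] / theta)
resp = w_prior * dens
resp /= resp.sum(axis=1, keepdims=True)

# Approximate conjugate update: add expected component counts to the prior,
# then read off posterior-mean mixture weights.
post_alpha = prior_alpha + resp.sum(axis=0)
w_post = post_alpha / post_alpha.sum()
print(np.round(w_post, 3))
```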
This paper adopts the extreme value and VaR approach to investigate the amount of rice damaged due to extreme events and analyzes the collective risk model as a feasible scheme for estimating annual aggregate losses. The results show that the annual frequency of rice damage caused by typhoons is well fitted by a one-parameter Poisson distribution.
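A minimal sketch of the frequency fit and the collective risk model, assuming Poisson annual counts and, purely for illustration, exponential severities rather than the extreme-value severities used in the paper; all figures are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Annual counts of damaging typhoon events (illustrative, not the paper's data).
annual_counts = np.array([2, 0, 1, 3, 1, 2, 0, 1, 4, 2])
lam = annual_counts.mean()            # MLE of the one-parameter Poisson frequency

# Collective risk model: annual aggregate loss = sum of N iid severities.
sims = 50_000
n = rng.poisson(lam, size=sims)
agg = np.array([rng.exponential(1.5e6, size=k).sum() for k in n])
print(round(lam, 2), round(np.quantile(agg, 0.99), 0))   # frequency MLE and a 99% VaR estimate
```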
Two recent papers by Dornheim and Brazauskas (2011a, 2011b) introduced a new likelihood-based approach for robust-efficient fitting of mixed linear models and showed that it possesses favorable large and small-sample properties which yield more accurate premiums when extreme outcomes are present in the data.
There is a dearth of public knowledge about the development patterns of mature workers compensation claims at the level of the aggregate loss triangle; this is because there are only a few loss triangles available for research that span the full lifetime of the cohort of claimants.
Predictive models are used by insurers for underwriting and ratemaking in personal lines insurance. Focusing on homeowners insurance, this paper examines many predictive generalized linear models, including those for pure premium (Tweedie), frequency (logistic) and severity (gamma). We compare predictions from models based on a single peril, or cause of loss, to those based on multiple perils.
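A small sketch of one of the model forms mentioned, a Tweedie pure premium GLM with a log link, fitted with statsmodels on toy data; the rating variables and values are invented, and the single-peril versus multi-peril comparison is not reproduced.

```python
import pandas as pd
import statsmodels.api as sm

# Toy homeowners data (illustrative); pure premium = incurred loss / exposure.
df = pd.DataFrame({
    "amount_of_insurance": [150, 200, 250, 300, 350, 400, 180, 260, 320, 380],
    "protection_class":    [1, 1, 2, 2, 3, 3, 1, 2, 3, 3],
    "pure_premium":        [120, 0, 310, 95, 0, 520, 80, 140, 410, 260],
})

X = sm.add_constant(df[["amount_of_insurance", "protection_class"]])
# A Tweedie family with variance power between 1 and 2 handles the mixed
# zero/continuous nature of pure premium; the default log link keeps predictions positive.
model = sm.GLM(df["pure_premium"], X,
               family=sm.families.Tweedie(var_power=1.5)).fit()
print(model.params)
```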
After laying a fairly rigorous foundation for the mathematical treatment of excess losses, this paper shows that the excess-loss function is akin to the probability distribution of its loss. All the moments of the loss can be reclaimed from the excess-loss function, the variance being especially simple. Excess-loss mathematics is a powerful tool for pricing loss layers, as in reinsurance.
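A numeric illustration of the claim that the moments of the loss can be reclaimed from the excess-loss function, using an exponential severity for which G(d) = E[(X - d)_+] is available in closed form; the notation and distribution choice are mine, not the paper's.

```python
import numpy as np
from scipy.integrate import quad

mu = 10.0                                   # exponential mean, chosen for illustration

def excess_loss(d):
    """Excess-loss (stop-loss) function G(d) = E[(X - d)_+] for an Exp(mu) loss."""
    return mu * np.exp(-d / mu)

mean = excess_loss(0.0)                                   # E[X]   = G(0)
second_moment = 2.0 * quad(excess_loss, 0.0, np.inf)[0]   # E[X^2] = 2 * integral of G
variance = second_moment - mean ** 2
print(mean, variance)                       # mu and mu^2 = 10, 100 for this example

# Pricing a reinsurance layer "width xs attachment" straight from G:
attachment, width = 15.0, 10.0
layer_expected_loss = excess_loss(attachment) - excess_loss(attachment + width)
print(round(layer_expected_loss, 3))
```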
This paper presents a methodology for constructing a deterministic approximation to the distribution of the outputs produced by the loss development method (also known as the chain-ladder method). The approximation distribution produced by this methodology is designed to meet a preset error tolerance condition.
It is generally well established that new business produces higher loss and expense ratios and lower retention ratios than renewal business. Ironically, to add more new business, an insurer needs higher profitability in order to generate the additional capital needed to support its exposure growth. Irrational growth is one of the top reasons for the insolvencies of property and casualty insurance companies.
The aim of this paper is to analyze the impact of underwriting cycles on the risk and return of non-life insurance companies. We integrate underwriting cycles in a dynamic financial analysis framework using a stochastic process, specifically, the Ornstein-Uhlenbeck process, which is fitted to empirical data and used to analyze the impact of these cycles on risk and return.
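A rough sketch of fitting a discretized Ornstein-Uhlenbeck process (equivalently an AR(1) in discrete time) to a cycle series and simulating a future path for a DFA engine; the combined-ratio series and the estimation shortcut are illustrative assumptions, not the paper's data or calibration.

```python
import numpy as np

# Annual combined ratios (illustrative, not the paper's data).
cr = np.array([1.08, 1.12, 1.05, 0.98, 0.95, 1.01, 1.07, 1.10, 1.03, 0.97, 0.99, 1.04])

# Fit the discretized OU process: x[t+1] = a + b*x[t] + e.
x, y = cr[:-1], cr[1:]
b, a = np.polyfit(x, y, 1)
resid_var = np.var(y - (a + b * x), ddof=2)

kappa = -np.log(b)                      # mean-reversion speed
theta = a / (1.0 - b)                   # long-run cycle level
sigma = np.sqrt(resid_var * 2 * kappa / (1 - np.exp(-2 * kappa)))
print(round(kappa, 2), round(theta, 3), round(sigma, 3))

# Simulate one future cycle path for use inside a DFA model.
rng = np.random.default_rng(1)
path = [cr[-1]]
for _ in range(10):
    path.append(theta + b * (path[-1] - theta) + rng.normal(0, np.sqrt(resid_var)))
print(np.round(path, 3))
```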
The models of Mack (1993) and Murphy (1994) are expanded to a continuously indexed family of chain-ladder models by broadening the variance structure of the error term. It is shown that, subject to certain restrictions, an actuary’s selected report-to-report factor can be considered the best linear unbiased estimate for some member of this family.
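The continuously indexed family can be illustrated on a single development period: under the variance assumption Var ∝ C^δ, the weighted-least-squares link ratio is a power-weighted average of the observed report-to-report factors. The triangle values below are invented for illustration.

```python
import numpy as np

# Cumulative losses at age j and age j+1 for several accident years (illustrative).
c_j  = np.array([1000.0, 1500.0, 2200.0, 900.0])
c_j1 = np.array([1400.0, 1950.0, 3080.0, 1300.0])

def ldf(c_j, c_j1, delta):
    """Link ratio when Var(error) ~ sigma^2 * C_j**delta:
    f = sum(C_j**(2-delta) * F) / sum(C_j**(2-delta)),  F = C_{j+1}/C_j.
    delta = 1 gives the volume-weighted chain-ladder factor,
    delta = 2 the simple average of link ratios,
    delta = 0 the least-squares regression factor through the origin."""
    F = c_j1 / c_j
    w = c_j ** (2.0 - delta)
    return np.sum(w * F) / np.sum(w)

for d in (0.0, 1.0, 2.0):
    print(d, round(ldf(c_j, c_j1, d), 4))
```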
Existing models of the market price of cat bonds are often too exotic or too simplistic; we present a model that is grounded in theory yet also tractable. We also intend for our analysis of cat bond pricing to shed light on broader issues relating to the theory of risk pricing.
To measure economic profits generated by an insurance policy during its lifetime, we compare the terminal assets of the policy account with a certain break-even value. The break-even value is an increasing function of the claims risk and the asset investment risk. It can be calculated with closed-form formulas. We study policies with multiyear loss payments and tax payments.
For loss cost filings beginning in October 2009, NCCI implemented the largest set of changes in 40 years to the methodology used to determine class pure premiums in workers compensation.
Regression analysis is one of the most commonly used statistical methods. But in its basic form, ordinary least squares (OLS) is not suitable for actuarial applications because the relationships are often nonlinear and the probability distribution of the dependent variable may be non-normal.
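A brief sketch of the point being made: on multiplicative, skewed data, an OLS fit on the raw scale is misspecified, while a gamma GLM with a log link recovers the generating coefficients. The data and parameters are simulated purely for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Simulated severity data: multiplicative (log-linear) effect with gamma noise.
driver = rng.uniform(0, 1, size=200)
mu = np.exp(7.0 + 1.5 * driver)
severity = rng.gamma(shape=2.0, scale=mu / 2.0)

X = sm.add_constant(driver)
ols = sm.OLS(severity, X).fit()                        # assumes additive, normal errors
glm = sm.GLM(severity, X,
             family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(ols.params, glm.params)                          # GLM recovers roughly [7.0, 1.5]
```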
NCCI changed its workers compensation ratemaking methodology to improve the treatment of large individual claims and catastrophic multiclaim events related to the perils of industrial accidents, earthquake, and terrorism. NCCI worked with a well-known modeling firm to determine provisions for catastrophic events on a state basis. This paper describes the new methodology that NCCI has filed in many states.
In pricing excess of loss reinsurance, the traditional method for applying credibility is as a weighted average of two estimates of expected loss: one from experience rating and a second from exposure rating. This paper will show how this method can be improved by incorporating loss estimates from lower layers, producing a multifactor credibility-weighted estimate of expected loss.
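For reference, the traditional two-estimate blend mentioned above is just a credibility-weighted average; the paper's multifactor extension using lower-layer estimates is not shown here. The figures are illustrative.

```python
def credibility_blend(experience_loss, exposure_loss, credibility_z):
    """Traditional excess-of-loss credibility estimate:
    Z * experience-rated loss + (1 - Z) * exposure-rated loss."""
    return credibility_z * experience_loss + (1.0 - credibility_z) * exposure_loss

# Layer 5m xs 5m (illustrative figures): sparse experience gets a low weight.
print(credibility_blend(experience_loss=1.2e6, exposure_loss=1.8e6, credibility_z=0.25))
```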
Motivated by the empirical evidence of the long-range dependency found within the Greek motor insurance market, we formulate a particular stochastic pricing model in a continuous framework. We assume the structure of a competitive insurance market where the business volume of each company is directly related to the existing relativity between the company’s premium and the market’s average premium.
New models for panel data that consist of a generalization of the hurdle model are presented and are applied to modeling a panel of claim counts. Correlated random effects are assumed for the two processes involved to allow for dependence among all the contracts held by the same insured. A method to obtain a posteriori distribution of the random effects as well as predictive distributions of the number of claims is presented.
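A static sketch of a hurdle count model, assuming a zero-truncated Poisson for the positive counts; the paper's correlated random effects across the contracts of the same insured, and the resulting predictive distributions, are omitted here.

```python
import math

def hurdle_poisson_pmf(k, p_zero, lam):
    """Hurdle count model: a binary process decides whether any claim occurs,
    then a zero-truncated Poisson generates the positive counts."""
    if k == 0:
        return p_zero
    truncated = (lam ** k * math.exp(-lam) / math.factorial(k)) / (1.0 - math.exp(-lam))
    return (1.0 - p_zero) * truncated

# Illustrative parameters only; probabilities over all k sum to one.
print([round(hurdle_poisson_pmf(k, p_zero=0.92, lam=1.3), 4) for k in range(4)])
```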
The risk benchmarks and underwriting cycle models presented here can be used by insurers in their enterprise risk management models. We analyze the historical underwriting cycle and develop a regime-switching model for simulating future cycles, and show its superiority to an autoregressive approach.
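A minimal sketch of a two-state (hard/soft market) regime-switching simulator of the kind the abstract describes; the regime parameters and transition probabilities are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two underwriting regimes with illustrative loss-ratio means and volatilities,
# plus a Markov transition matrix governing how long each regime persists.
means  = np.array([0.65, 0.85])
sigmas = np.array([0.04, 0.06])
trans  = np.array([[0.85, 0.15],      # P(stay hard), P(hard -> soft)
                   [0.20, 0.80]])     # P(soft -> hard), P(stay soft)

def simulate_cycle(n_years, start_state=0):
    state, path = start_state, []
    for _ in range(n_years):
        path.append(rng.normal(means[state], sigmas[state]))
        state = rng.choice(2, p=trans[state])
    return np.array(path)

print(np.round(simulate_cycle(10), 3))
```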
Correlations of future observations are investigated within the recursive and non-recursive chain-ladder models. The recursive models considered are the Mack and over-dispersed Poisson (ODP) Mack models; the non-recursive models are the ODP cross-classified models. Distinct similarities are found between the correlations within the recursive and non-recursive models, but distinct differences also emerge.
Risk valuation is the process of assigning a monetary value to a transformation of risk. Risk transformation can come about through changes in the operation of a business, explicit risk transfer mechanisms, financial changes, etc.
Capital allocation is a theoretical exercise, since all of a firm’s capital could be depleted to cover a significant loss arising from any one segment. However, firms do need to allocate capital for pricing, risk management, and performance evaluation. One versatile allocation method, the Ruhm-Mango-Kreps algorithm, has several key advantages: additivity, simplicity, and flexibility.
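A compact sketch of an RMK-style allocation: a common set of scenario weights derived from total-company outcomes is applied to every segment, which makes the allocation additive by construction. The TVaR-like weight choice and the simulated losses below are illustrative assumptions, not the algorithm's required form.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated losses by scenario for two business segments (illustrative).
sims = 100_000
seg = np.column_stack([rng.lognormal(3.0, 0.8, sims),
                       rng.lognormal(3.5, 0.5, sims)])
total = seg.sum(axis=1)

# Scenario weights that emphasize bad total-company outcomes: weight 1 inside the
# worst 1% of scenarios and 0 elsewhere (a TVaR-style choice; the framework
# accommodates many other riskiness-leverage functions).
threshold = np.quantile(total, 0.99)
w = (total >= threshold).astype(float)
w /= w.mean()                              # normalize so the weights average to 1

# Each segment's allocation is its scenario-weighted average loss; additivity holds
# because the same weights apply to every segment.
alloc = (seg * w[:, None]).mean(axis=0)
print(np.round(alloc, 1), round((total * w).mean(), 1))   # segment allocations sum to the total
```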