Browse Research

Viewing 1151 to 1175 of 7690 results
2011
New models for panel data, consisting of a generalization of the hurdle model, are presented and applied to modeling a panel of claim counts. Correlated random effects are assumed for the two processes involved to allow for dependence among all the contracts held by the same insured. A method to obtain the a posteriori distribution of the random effects, as well as predictive distributions of the number of claims, is presented.
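The hurdle structure referred to above separates the claim/no-claim decision from the size of a positive count. A minimal, self-contained sketch of a hurdle-Poisson probability mass function (the parameters and the zero-truncated Poisson choice are illustrative, not the paper's panel specification with random effects):

```python
import numpy as np
from scipy import stats

# Hurdle-Poisson sketch: a Bernoulli "hurdle" decides whether any claim
# occurs; given at least one claim, counts follow a zero-truncated Poisson.
# p_claim and lam are illustrative placeholders.
p_claim, lam = 0.3, 1.5

def hurdle_pmf(k):
    """P(N = k) under the hurdle-Poisson model."""
    if k == 0:
        return 1 - p_claim
    trunc = 1 - np.exp(-lam)                 # P(Poisson(lam) >= 1)
    return p_claim * stats.poisson.pmf(k, lam) / trunc

# The pmf sums to one: mass 1 - p_claim at zero, p_claim spread over k >= 1.
total = sum(hurdle_pmf(k) for k in range(200))
```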
2011
The risk benchmarks and underwriting cycle models presented here can be used by insurers in their enterprise risk management models. We analyze the historical underwriting cycle, develop a regime-switching model for simulating future cycles, and show its superiority to an autoregressive approach.
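A regime-switching simulation of this kind can be sketched with a two-state Markov chain; the regimes, transition matrix, and loss-ratio parameters below are hypothetical placeholders, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two illustrative regimes: 0 = "soft" market (high loss ratios),
# 1 = "hard" market (low loss ratios). P[i, j] is the probability of
# moving from regime i to regime j in one year.
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
mean_lr = np.array([1.05, 0.85])   # mean loss ratio in each regime
sigma = 0.05                       # within-regime volatility

def simulate_cycle(n_years, start=0):
    """Simulate one loss-ratio path under the two-regime Markov chain."""
    state, path = start, []
    for _ in range(n_years):
        path.append(rng.normal(mean_lr[state], sigma))
        state = rng.choice(2, p=P[state])
    return np.array(path)

path = simulate_cycle(20)
```

Unlike an autoregressive model, the simulated cycle here switches between persistent hard- and soft-market levels rather than reverting smoothly to a single mean.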
2011
Correlations of future observations are investigated within the recursive and non-recursive chain-ladder models. The recursive models considered are the Mack and over-dispersed Poisson (ODP) Mack models; the non-recursive models are the ODP cross-classified models. Distinct similarities are found between the correlations within the recursive and non-recursive models, but distinct differences also emerge.
2011
Risk valuation is the process of assigning a monetary value to a transformation of risk. Risk transformation can come about through changes in the operation of a business, explicit risk transfer mechanisms, financial changes, etc.
2011
Capital allocation is a theoretical exercise, since all of a firm’s capital could be depleted to cover a significant loss arising from any one segment. However, firms do need to allocate capital for pricing, risk management, and performance evaluation. One versatile allocation method, the Ruhm-Mango-Kreps algorithm, has several key advantages: additivity, simplicity, and flexibility.
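A scenario-based sketch in the spirit of the Ruhm-Mango-Kreps idea: weight each simulated scenario by a riskiness factor, then allocate to every segment its weight-averaged loss, which makes the allocation additive by construction. The loss distributions and the tail-indicator weight function below are illustrative choices, not the algorithm's required inputs:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated losses for three hypothetical segments across 100,000 scenarios.
seg = rng.gamma(2.0, 100.0, size=(100_000, 3))
total = seg.sum(axis=1)

# Illustrative riskiness weight: indicator of the worst 1% of total-loss
# scenarios, normalized so the weights average to one.
w = (total > np.quantile(total, 0.99)).astype(float)
w /= w.mean()

# Each segment's allocation is its risk-weighted expected loss; by
# linearity the allocations sum exactly to the risk-weighted total.
alloc = (seg * w[:, None]).mean(axis=0)
```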
2011
We consider Tweedie’s compound Poisson model in a claims reserving triangle in a generalized linear model framework. We show that there exist practical situations where the variance, as well as the mean of the costs, needs to be modeled. We optimize the likelihood function through either direct optimization or through double generalized linear models (DGLM).
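A Tweedie variable with power parameter between 1 and 2 is a compound Poisson sum of gamma severities, which can be simulated directly; the parameter values below are illustrative, and this sketch shows only the distributional structure, not the paper's DGLM fitting:

```python
import numpy as np

rng = np.random.default_rng(1)

# Compound Poisson-gamma (Tweedie, 1 < p < 2): total cost in a cell is the
# sum of N ~ Poisson(lam) claims, each ~ Gamma(shape, scale). Illustrative
# parameters only.
lam, shape, scale = 2.0, 3.0, 100.0

def tweedie_sample(size):
    """Draw compound Poisson-gamma totals, one per cell."""
    counts = rng.poisson(lam, size)
    return np.array([rng.gamma(shape, scale, n).sum() for n in counts])

s = tweedie_sample(50_000)
mean_theory = lam * shape * scale                      # = 600
var_theory = lam * shape * (shape + 1) * scale ** 2    # = 240,000
```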
2011
It has become common to use historical data as a guide in analyzing future risks. However, the statistical tools used often are based on the assumption that the data (regardless of the source) may be treated as independent data for risk analysis purposes. In some cases, the data is conditional in nature, and the proper tool needs to be one that reflects this characteristic of the data.
2011
Motivation: Pricing legislative changes is an integral part of NCCI ratemaking.
2011
Motivation: There is increasing evidence that obesity contributes to the cost of medical care in workers compensation, and that this contribution is significant in magnitude. For instance, a recent study of workers compensation claims of Duke University employees shows that, for the morbidly obese, the medical costs per 100 full-time equivalent employees are nearly seven times as high as for employees of recommended weight.
2011
In pricing excess of loss reinsurance, the traditional method for applying credibility is as a weighted average of two estimates of expected loss: one from experience rating and a second from exposure rating.
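The traditional weighted average mentioned above is a one-line calculation; the two loss estimates and the credibility factor below are hypothetical numbers for illustration:

```python
# Credibility-weighted blend of experience and exposure rating.
experience_loss = 1_200_000.0   # estimate from the layer's own experience
exposure_loss = 900_000.0       # estimate from exposure curves
Z = 0.4                         # credibility assigned to experience rating

blended = Z * experience_loss + (1 - Z) * exposure_loss   # ≈ 1,020,000
```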
2011
The article tests the hypothesis that insurance price subsidies created by rate regulation lead to higher insurance cost growth. The article makes use of data from the Massachusetts private passenger automobile insurance market, where cross-subsidies were explicitly built into the rate structure through rules that limit rate differentials and differences in rate increases across driver rating categories.
2011
Regression models for limited continuous dependent variables having a non-negligible probability of attaining exactly their limits are presented. The models differ in the number of parameters and in their flexibility. Fractional data being a special case of limited dependent data, the models also apply to variables that are a fraction or a proportion.
2011
In the framework of the classical compound Poisson process in collective risk theory, we study a modification of the horizontal dividend barrier strategy by introducing random observation times at which dividends can be paid and ruin can be observed.
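The modification described above can be illustrated by a sample-path simulation: the surplus evolves as a classical compound Poisson risk process, but dividends are paid, and ruin detected, only at random observation times. All parameter values here are illustrative placeholders:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative parameters: premium rate c, Poisson claim rate lam,
# exponential claim sizes with mean claim_mean, dividend barrier, and
# exponential inter-observation times with rate gamma.
c, lam, claim_mean, barrier, gamma = 1.5, 1.0, 1.0, 5.0, 2.0

def simulate(u0=2.0, horizon=100.0):
    """Return (total dividends paid, ruin observed) over one sample path."""
    u, t, dividends = u0, 0.0, 0.0
    t_claim = rng.exponential(1 / lam)   # next claim arrival
    t_obs = rng.exponential(1 / gamma)   # next random observation time
    while True:
        t_next = min(t_claim, t_obs)
        if t_next > horizon:
            break
        u += c * (t_next - t)            # premiums accrue continuously
        t = t_next
        if t_claim <= t_obs:             # a claim occurs
            u -= rng.exponential(claim_mean)
            t_claim = t + rng.exponential(1 / lam)
        else:                            # a random observation time
            if u < 0:                    # ruin is only detected here
                return dividends, True
            if u > barrier:              # pay the excess over the barrier
                dividends += u - barrier
                u = barrier
            t_obs = t + rng.exponential(1 / gamma)
    return dividends, False

paid, ruined = simulate()
```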
2011
The dual model with diffusion is appropriate for companies with continuous expenses that are offset by stochastic and irregular gains. Examples include research-based or commission-based companies. In this context, Avanzi and Gerber (2008) showed how to determine the expected present value of dividends, if a barrier strategy is followed.
2011
In this paper we investigate the potential of Lévy copulas as a tool for modelling dependence between compound Poisson processes and their applications in insurance. We analyse characteristics regarding the dependence in frequency and dependence in severity allowed by various Lévy copula models.
2011
It is known that the partial stop-loss contract is an optimal reinsurance form under the VaR risk measure.
2011
We analyze different types of guaranteed withdrawal benefits for life, the latest guarantee feature within variable annuities. Besides an analysis of the impact of different product features on the clients’ payoff profile, we focus on pricing and hedging of the guarantees. In particular, we investigate the impact of stochastic equity volatility on pricing and hedging.
2011
In this paper, we study two classes of optimal reinsurance models by minimizing the total risk exposure of an insurer under the criteria of value at risk (VaR) and conditional value at risk (CVaR). We assume that the reinsurance premium is calculated according to the expected value principle. Explicit solutions for the optimal reinsurance policies are derived over ceded loss functions with increasing degrees of generality.
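The effect of a stop-loss contract on VaR, with the premium set by the expected value principle, can be checked numerically; the exponential loss distribution, retention, loading, and confidence level below are illustrative choices, not the paper's derivation:

```python
import numpy as np

rng = np.random.default_rng(3)

# For an Exponential(1) gross loss X and a stop-loss treaty with retention
# d, the insurer retains min(X, d) and pays premium (1 + theta) * E[(X-d)+]
# under the expected value principle.
x = rng.exponential(1.0, 200_000)
theta, d, alpha = 0.2, 1.0, 0.99

premium = (1 + theta) * np.maximum(x - d, 0).mean()
gross_var = np.quantile(x, alpha)
net_var = np.quantile(np.minimum(x, d), alpha) + premium
# Capping the retained tail at d pulls net VaR well below gross VaR here.
```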
2011
Modeling dependencies among multiple loss triangles has important implications for the determination of loss reserves, a critical element of risk management and capital allocation practices of property-casualty insurers. In this article, we propose a copula regression model for dependent lines of business that can be used to predict unpaid losses and hence determine loss reserves.
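A minimal Gaussian-copula sketch of dependence between two lines of business (the correlation parameter and the marginal distributions are illustrative, and this is not the paper's regression model, which also conditions on covariates):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Gaussian copula: draw correlated normals, map to dependent uniforms,
# then apply each line's own marginal distribution.
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
u = stats.norm.cdf(z)                                     # dependent uniforms
losses_a = stats.gamma.ppf(u[:, 0], a=2.0, scale=500.0)   # line A margin
losses_b = stats.lognorm.ppf(u[:, 1], s=0.8, scale=300.0) # line B margin

corr = np.corrcoef(losses_a, losses_b)[0, 1]
```

The same uniforms drive both margins, so dependence is specified separately from each line's loss distribution.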
2011
In this paper, we study the fair valuation of participating life insurance contracts, one of the most common life insurance products, under a jump diffusion model with consideration of default risk. The participating life insurance contracts considered here can be expressed as portfolios of options, as shown by Grosen and Jørgensen (1997). We use Laplace transform methods to price these options.
2011
The forecasting of the future mortality of the very old presents additional challenges since data quality can be poor at such ages. We consider a two-factor model for stochastic mortality, proposed by Cairns, Blake and Dowd, which is particularly well suited to forecasting at very high ages. We consider an extension to their model which improves fit and also allows forecasting at these high ages.
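The Cairns-Blake-Dowd two-factor structure sets logit q(t, x) = κ₁(t) + κ₂(t)(x − x̄), so within each calendar year mortality is linear in age on the logit scale, which is what makes it well behaved at very high ages. A sketch for a single year with illustrative (not fitted) parameter values:

```python
import numpy as np

# CBD two-factor model for one calendar year:
#   logit q(x) = kappa1 + kappa2 * (x - x_bar).
# kappa1 (level) and kappa2 (slope) are illustrative values only.
ages = np.arange(60, 100)
x_bar = ages.mean()
kappa1, kappa2 = -3.0, 0.1

logit_q = kappa1 + kappa2 * (ages - x_bar)
q = 1 / (1 + np.exp(-logit_q))   # inverse logit gives death probabilities

# q increases with age and stays in (0, 1) by construction, even when
# extrapolated to the oldest ages.
```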
2011
The mortality evolution of small populations often exhibits substantial variability and irregular improvement patterns making it hard to identify underlying trends and produce plausible projections. We propose a methodology for robust forecasting based on the existence of a larger reference population sharing the same long-term trend as the population of interest.
2011
We quantify the overall impact of genetic information on the insurance industry using the ‘bottom-up’ approach, in which detailed models are constructed of representative major genetic disorders. We consider six such disorders, namely adult polycystic kidney disease, early-onset Alzheimer’s disease, Huntington’s disease, myotonic dystrophy (MD), hereditary non-polyposis colorectal cancer, and breast/ovarian cancer.
2011
This paper investigates market-consistent valuation of insurance liabilities, in the context of Solvency II and, to some extent, IFRS 4. We propose an explicit and consistent framework for the valuation of insurance liabilities which incorporates the Solvency II approach as a special case.
2011
In this paper the catastrophe bond prices, as determined by the market, are analysed. The limited published work in this area has been carried out mainly by cat bond investors and is based either on intuition, on simple one-factor linear regression, or on comparisons of the prices of cat bonds with similar features. In this paper a Generalised Additive Model is fitted to the market data.