Browse Research

2004
A multi-level factor (MLF) is a rating factor with a large number of levels, each of which we want to rate separately even though many of them do not have a sufficient amount of data. Examples include car model, geographic zone, and company (in experience rating). Rating MLFs is a standard setting for applying credibility theory.
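Credibility theory blends each level's own experience with the portfolio average according to how much data the level has. A minimal sketch of that idea in Python (the level names, data, and credibility constant k are illustrative assumptions, not taken from the paper):

```python
# Minimal credibility-rating sketch for a multi-level factor (illustrative only).
from statistics import mean

def credibility_estimate(level_claims, k=50.0):
    """Blend each level's own mean with the overall mean using Z = n / (n + k)."""
    overall = mean(x for claims in level_claims.values() for x in claims)
    estimates = {}
    for level, claims in level_claims.items():
        n = len(claims)
        z = n / (n + k)                      # credibility weight for this level
        estimates[level] = z * mean(claims) + (1 - z) * overall
    return estimates

# A sparsely observed car model is pulled strongly toward the overall mean.
data = {"model_A": [120, 95, 130, 110, 105], "model_B": [400]}
print(credibility_estimate(data))
```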
2004
This paper provides a case study in the application of generalised linear models (“GLMs”) to loss reserving. The study approaches the exercise from the viewpoint of an actuary predisposed to applying the chain ladder (“CL”). The data set under study is seen to violate the conditions for application of the CL in a number of ways.
2004
The insurance market has long been subject to pricing cycles. During the so-called ‘soft market’, pricing may produce break-even profitability results or even operating losses for some companies. This is then followed by a ‘hard market’, in which insurance prices are relatively high. Pricing then falls and a soft market slowly erodes profits, continuing the cycle.
2004
In this paper we examine and summarize properties of several well-known risk measures, with special attention given to the class of distortion risk measures. We investigate the relationship between these risk measures and theories of choice under risk. We also consider the problem of evaluating risk measures for sums of nonindependent random variables and propose approximations based on the concept of comonotonicity.
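As a hedged illustration of two of the ideas named in the abstract, the sketch below evaluates a distortion risk measure on an empirical sample (using the distortion that yields TVaR) and uses the additivity of quantiles under comonotonicity to approximate the quantile of a sum; the lognormal sample, the level p, and the two identical marginals are assumptions made purely for illustration.

```python
# Sketch only: a distortion risk measure rho_g(X) = integral of g(S_X(x)) dx
# evaluated on a sorted sample, plus the comonotonic approximation in which
# the quantile of the sum is the sum of the marginal quantiles.
import numpy as np

def distortion_risk_measure(losses, g):
    """Approximate rho_g(X) for a nonnegative sample via the empirical survival function."""
    x = np.concatenate(([0.0], np.sort(np.asarray(losses, dtype=float))))
    n = len(losses)
    surv_left = (n - np.arange(n)) / n          # S(x) just left of each order statistic
    return float(np.sum(g(surv_left) * np.diff(x)))

def comonotonic_sum_quantile(marginal_quantiles, p):
    """Under comonotonicity, q_S(p) equals the sum of the marginal p-quantiles."""
    return sum(q(p) for q in marginal_quantiles)

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

p = 0.99
g_tvar = lambda s: np.minimum(s / (1 - p), 1.0)   # distortion whose risk measure is TVaR_p
print("distortion (TVaR) estimate:", distortion_risk_measure(losses, g_tvar))

q = lambda level: float(np.quantile(losses, level))
print("comonotonic 99% quantile of X1 + X2:", comonotonic_sum_quantile([q, q], p))
```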
2004
The Casualty Actuarial Society (CAS) is proud to present the results of two commissioned analyses showing the impact of fair value concepts applied to property/casualty insurance companies. Last fall, the CAS sent out a request for proposal (RFP) to selected consulting firms, seeking research on the impact of the following fair value concepts on property/casualty insurance company financial statements:
2004
In this paper the probability of ruin is investigated under the influence of a premium rate that varies according to the intensity of claims, with claim occurrence described by a Cox process in the risk model considered. The idea originates from Jasiulewicz (2001). We make a slight modification to the model and generalize the intensity process.
2004
The main tools and concepts of financial and actuarial theory are designed to handle standard, or even small risks. The aim of this paper is to reconsider some selected financial problems, in a setup including infrequent extreme risks.
2004
Using the language of copulas, we generalize the famous Fisher-Tippett Theorem of extreme value theory to sequences of dependent random variables. The dependence structure is modelled using Archimedean copulas. This generalization makes it possible to study the behaviour of the maxima of dependent random sequences. Keywords: Archimedean Copula, Dependent Risks, Extreme Value Theory, Maximum Domain Of Attraction, Fisher-Tippett Theorem
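The theorem itself concerns limiting distributions, but the flavour of the result can be seen in a small simulation contrasting block maxima of independent sequences with block maxima whose components are coupled by a Clayton (Archimedean) copula; the copula, its parameter, the margins, and the block size are all assumptions made for illustration only.

```python
# Sketch: how Archimedean-copula dependence changes the behaviour of maxima.
import numpy as np

rng = np.random.default_rng(1)

def clayton_uniforms(n_blocks, block_size, theta):
    """Sample uniforms coupled by a Clayton copula via the gamma-frailty construction."""
    v = rng.gamma(shape=1.0 / theta, scale=1.0, size=(n_blocks, 1))  # shared frailty
    e = rng.exponential(size=(n_blocks, block_size))
    return (1.0 + e / v) ** (-1.0 / theta)

n_blocks, block_size, theta = 100_000, 50, 2.0
u_dep = clayton_uniforms(n_blocks, block_size, theta)
u_ind = rng.random((n_blocks, block_size))

# Identical Pareto-type margins (tail index 2) for both cases; only the dependence differs.
x_dep = (1.0 - u_dep) ** -0.5
x_ind = (1.0 - u_ind) ** -0.5

print("median block maximum, Clayton-dependent:", np.median(x_dep.max(axis=1)))
print("median block maximum, independent:      ", np.median(x_ind.max(axis=1)))
```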
2004
We consider a stochastic risk reserve process whose risk exposure can be controlled dynamically by applying proportional reinsurance and by issuing CAT bonds. The CAT bond payments are only partly correlated with the insurer's losses. The aim is to minimize the probability of ruin. Using a two-dimensional diffusion approximation, we obtain a controlled diffusion problem that can be solved explicitly with the help of the HJB equation.
2004
The insurance process is complex, with numerous factors combining to produce both premiums and losses. When compiling rates, actuaries often aggregate data from more than one source, while at the same time stratifying the data to achieve homogeneity.
2004
During the Spring 2004 NAIC meeting, it was announced that Colorado has become the first state to adopt the updated Interstate Insurance Product Regulation Compact developed by the NAIC. Several other states have the compact before their legislatures and the goal is to have at least 30 states ratify the compact by 2008.
2004
William Wells provides readers with a non-technical beginner's guide to the event study methodology, which is often used to study the impact of regulation on the insurance industry. Event studies have been the focus of several recent articles in the JIR, including an article that appears in this edition.
2004
The insurance industry has become more important for systemic financial stability. As a result, the supervision and disclosure of financial risks of insurance companies need to be strengthened. The insurance industry's increasing systemic importance also suggests that there is a need to search for some middle ground in the discussion of fair value accounting to mitigate potentially destabilizing financial volatility.
2004
My job is to contribute briefly to the considerations surrounding the insurability of new liability risks. I propose to divide the topic into two questions, namely, "What are new liability risks?" and "What can we say about their insurability?" As far as the first question is concerned, I am not going to speculate about the effects of future technologies.
2004
This paper investigates the ultimate ruin probability of a discrete time risk model with a positive constant interest rate. Under the assumption that the gross loss of the company within one year is subexponentially distributed, a simple asymptotic relation for the ruin probability is derived and compared to existing results. Keywords: Asymptotics, Constant Interest Rate, Matuszewska Index, Ruin Probability, Subexponentiality
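A hedged Monte Carlo sketch of that kind of model follows: a discrete-time surplus process that earns a constant interest rate each year and pays a heavy-tailed (here Pareto-type) annual gross loss. The horizon, the premium loading, and all parameter values are assumptions for illustration; this is a simulation of the setting, not the paper's asymptotic formula.

```python
# Sketch: simulated ruin probability for a discrete-time risk model with
# constant interest rate r and heavy-tailed (Pareto-type) annual gross losses.
import numpy as np

rng = np.random.default_rng(2)

def ruin_probability(u0, premium, r, alpha, years, n_paths=200_000):
    """Estimate P(surplus falls below 0 within `years`) by simulation."""
    surplus = np.full(n_paths, float(u0))
    ruined = np.zeros(n_paths, dtype=bool)
    for _ in range(years):
        # Pareto-type gross loss with tail index alpha (support starting at 0).
        losses = (1.0 - rng.random(n_paths)) ** (-1.0 / alpha) - 1.0
        surplus = (surplus + premium) * (1.0 + r) - losses
        ruined |= surplus < 0
    return float(ruined.mean())

print(ruin_probability(u0=10.0, premium=2.5, r=0.04, alpha=1.5, years=20))
```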
2004
Properties of the distribution of the deficit at ruin in the stationary renewal risk model are studied. A mixture representation for the conditional distribution of the deficit at ruin (given that ruin occurs) is derived, as well as a stochastic decomposition involving the residual lifetime associated with the maximal aggregate loss.
2004
This paper deals with the severity of ruin in a discrete semi-Markov risk model. It is shown that the work of Reinhard and Snoussi (Stochastic Models, 18) can be extended to cover the case where the premium is an integer value and no restriction on the annual result is imposed. In particular, it is shown that the severity of ruin without initial surplus is the solution of a system of equations.
2004
Non-traditional reinsurance contracts, and finite risk reinsurance contracts in particular, are structured differently from traditional reinsurance. The incorporation of special features that make each contract unique tends to preclude standard portfolio loss reserving.
2004
In order to reveal and better understand the inner workings of insurance credit scoring models used by the vast majority of personal lines insurers, the authors obtained nine private passenger automobile and two homeowners’ filings from nine insurance groups from the Virginia Bureau of Insurance.
2004
This paper develops a risk pricing procedure by examining the role of capital in an insurance transaction. An insurance transaction differs from an investment in that an insurer uses capital at the time a claim is settled rather than when the policy is issued, and only if the damages exceed the premium for the exposure.
2004
The investment income received by a property-casualty company can be a prime component in its pricing and decision to write some lines of business that generate underwriting losses. In times of high interest rates it can enable the insurer to write during soft markets and to gain market share by taking on previously uninsurable risks.
2004
This paper deals with theoretical and practical pricing of non-life insurance contracts within a financial option pricing context. The market-based assumption approach of the option context fits well with the practical nature of non-life insurance pricing and valuation. Basic facts in most insurance markets, such as the existence of quite different insurer price offers for the same claims risk in the same market, support the need for this approach.
2004
This paper shows how expert opinion can be inserted into a stochastic framework for claims reserving. The reserving methods used are the chain-ladder technique and Bornhuetter-Ferguson, and the stochastic framework follows England and Verrall (2002). Although stochastic models have been studied, there are two main obstacles to their more frequent use in practice: ease of implementation and adaptability to user needs.
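The two deterministic methods named here are simple enough to sketch. In the Python sketch below, the triangle, the expert prior ultimates, and the volume-weighted choice of development factors are assumptions made for illustration and are not taken from the paper; the stochastic layer the paper builds around these methods is omitted.

```python
# Sketch: chain-ladder development factors and a Bornhuetter-Ferguson ultimate
# that blends the latest paid losses with an expert prior for each accident year.
import numpy as np

# Cumulative paid-loss triangle (rows = accident years, columns = development years).
triangle = [
    [1001, 1855, 2423, 2988],
    [1113, 2103, 2774],
    [1265, 2433],
    [1490],
]

n = len(triangle)
# Volume-weighted chain-ladder factors f_j = sum_i C_{i,j+1} / sum_i C_{i,j}.
factors = []
for j in range(n - 1):
    num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
    den = sum(row[j] for row in triangle if len(row) > j + 1)
    factors.append(num / den)

expert_ultimates = [3000, 3200, 3400, 3600]        # expert/prior ultimates (assumed)

for i, row in enumerate(triangle):
    latest = row[-1]
    cum_factor = np.prod(factors[len(row) - 1:])   # factor to ultimate (1.0 if fully developed)
    cl_ultimate = latest * cum_factor
    pct_unreported = 1.0 - 1.0 / cum_factor
    bf_ultimate = latest + expert_ultimates[i] * pct_unreported
    print(f"AY {i}: CL ultimate {cl_ultimate:8.0f}  BF ultimate {bf_ultimate:8.0f}")
```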