Browse Research

1980
I looked forward to reviewing this paper, first because I was intrigued by the title (I couldn't understand how closed claim survey data could possibly be used for insurance pricing) and second because I thought I might learn something about closed claim surveys.
1980
A fundamental problem of pricing insurance is: when all is known about claims from an accident- or policy-year, that year is too old to be relevant for next year's coverage. Thus, our ancestors began using aggregate historical patterns to estimate how incurred costs of recent periods would mature to full ultimate value.
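A minimal sketch of that idea, in the style of chain-ladder age-to-age factors (one common formalization, not necessarily the author's; the triangle is made up):

import numpy as np

# Hypothetical cumulative incurred losses by accident year (rows)
# and development age (columns); NaN marks not-yet-observed cells.
triangle = np.array([
    [1000, 1500, 1650, 1700],
    [1100, 1700, 1870, np.nan],
    [1200, 1800, np.nan, np.nan],
    [1300, np.nan, np.nan, np.nan],
])

# Volume-weighted age-to-age factors from the observed columns.
factors = []
for j in range(triangle.shape[1] - 1):
    mask = ~np.isnan(triangle[:, j + 1])
    factors.append(triangle[mask, j + 1].sum() / triangle[mask, j].sum())

# Project each accident year's latest observed value to ultimate.
ultimates = []
for row in triangle:
    latest = np.where(~np.isnan(row))[0][-1]
    ultimates.append(row[latest] * np.prod(factors[latest:]))

print("age-to-age factors:", np.round(factors, 3))
print("estimated ultimates:", np.round(ultimates, 0))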
1980
Parameter estimation for claim cost distributions whose characteristics are modelled loglinearly is mathematically tractable, especially for the Inverse Gaussian and Lognormal distributions.
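For the Lognormal case the tractability is easy to see: with a loglinear model for the location parameter, maximizing the likelihood reduces to ordinary least squares on log claim sizes. A minimal sketch with made-up data (variable names are illustrative, not the paper's):

import numpy as np

rng = np.random.default_rng(0)

# Made-up design matrix (intercept + one rating variable) and
# Lognormal claims whose log-scale mean is linear in the covariates.
X = np.column_stack([np.ones(500), rng.normal(size=500)])
beta_true = np.array([7.0, 0.3])
claims = rng.lognormal(mean=X @ beta_true, sigma=0.5)

# The MLE of beta for a Lognormal with loglinear location is just
# least squares on log(claims) -- a closed-form solution.
y = np.log(claims)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma_hat = np.sqrt(np.mean((y - X @ beta_hat) ** 2))

print("beta_hat:", np.round(beta_hat, 3), "sigma_hat:", round(sigma_hat, 3))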
1980
In a recent paper, SEAL (1980) numerically calculated survival probabilities based on Pareto claim distributions.
1980
It is commonly thought that the characteristic function (Fourier transform) of the Pareto distribution has no known functional form (e.g. SEAL, 1978, pp. 14, 40, 57). This is quite untrue. Nevertheless the characteristic function of the Pareto density is conspicuously absent from standard reference works even when the Pareto distribution itself receives substantial comment (e.g. HAIGHT, 1961; JOHNSON and KOTZ, 1970, Ch.
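Whatever closed form the note establishes, the characteristic function is straightforward to evaluate numerically for checking purposes. A sketch by quadrature, assuming the parameterization f(x) = a(1 + x)^-(a+1) on x > 0 (an assumption; the note's parameterization may differ):

import numpy as np
from scipy.integrate import quad

def pareto_cf(t, alpha):
    """E[exp(itX)] for the Pareto density f(x) = alpha*(1+x)**-(alpha+1),
    x > 0, computed by numerical quadrature (assumed parameterization)."""
    f = lambda x: alpha * (1.0 + x) ** -(alpha + 1.0)
    re, _ = quad(lambda x: np.cos(t * x) * f(x), 0, np.inf, limit=200)
    im, _ = quad(lambda x: np.sin(t * x) * f(x), 0, np.inf, limit=200)
    return complex(re, im)

print(pareto_cf(0.0, alpha=1.5))   # should be exactly 1
print(pareto_cf(1.0, alpha=1.5))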
1980
We consider a usual situation in risk theory, in which the arrival process is a Poisson process and the claim process a positive (J, X) process inducing a semi-Markov process.
1980
Reinsurance Research - General/NOC
1980
My compliments go to John Kollar for the careful deliberation given and the time spent in the active role of providing us with a paper for discussion. In all honesty, however, I expected a paper much different in scope. When I was asked to review this paper, I expected to receive a recipe guide for a beginning student in my office to read and use before he or she began asking the imponderable questions that I will never be able to answer.
1980
As with any other line of insurance, the ratemaker's goal is to develop rates that will cover losses and expenses (including underwriting profit) arising from policies in force during a specified future period. In order to accomplish this goal, a proper match between premiums (or exposures) and losses plus expenses must be established.
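The matching idea can be made concrete with the familiar loss-ratio rate indication (a standard formulation, not necessarily this paper's; all figures are hypothetical):

# Hypothetical loss-ratio rate indication illustrating the
# premium/loss matching idea (all figures made up).
earned_premium_on_level = 10_000_000   # historical premium restated at current rates
losses_trended_developed = 7_800_000   # losses trended and developed to ultimate
expense_ratio = 0.28                   # expenses as a share of premium
profit_load = 0.05                     # target underwriting profit provision

expected_loss_ratio = losses_trended_developed / earned_premium_on_level
permissible_loss_ratio = 1.0 - expense_ratio - profit_load
indicated_rate_change = expected_loss_ratio / permissible_loss_ratio - 1.0

print(f"indicated rate change: {indicated_rate_change:+.1%}")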
1980
My initial impression of Mr. Karlinski's paper was that it is a sales piece aimed toward fuller utilization of the actuary's training and skills in the pricing decision process. Subsequent readings confirm that impression.
1980
General/Profit Factor/Rate of Return/Risk
1980
This is an interesting paper. It presents a progress report on the analytical approach one large reinsurer is developing toward the pricing of excess casualty coverage. The approach is an analytical one, in that pricing decisions are made on the basis of information generated by a theoretical pure premium distribution fitted to sample data.
Claim Size Modeling/Loss Distribution
1980
An excess-of-loss reinsurance treaty provides the primary insurance company (cedant) with reinsurance protection covering a certain layer of loss for a specified category of individual (direct) insurance policies. Hence, for each loss event (occurrence) coming within the terms of the treaty, the reinsurer reimburses the cedant for the dollars of loss in excess of a certain fixed retention up to some maximum amount of liability per occurrence.
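In symbols, the reinsurer's payment per occurrence is min(max(loss - retention, 0), limit). A one-function sketch (illustrative names, hypothetical layer):

def layer_recovery(loss: float, retention: float, limit: float) -> float:
    """Reinsurer's payment for one occurrence under an excess-of-loss
    treaty: the part of the loss above the retention, capped at the
    per-occurrence limit."""
    return min(max(loss - retention, 0.0), limit)

# Example: a 250,000 xs 100,000 layer.
for loss in (80_000, 150_000, 500_000):
    print(loss, "->", layer_recovery(loss, retention=100_000, limit=250_000))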
1980
This short note has as its starting point an interesting article by Taylor in which he considered the effects of inflation on a risk process. Taylor showed that if the premium density increased at the same rate as the cost of individual claims then, under certain conditions, ultimate ruin was certain.
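Taylor's setting can be illustrated with a small Monte Carlo sketch: a compound Poisson surplus process in which both the premium density and the claim sizes inflate at the same exponential rate. Every parameter below is hypothetical, and this is only an illustration of the setup, not of either paper's analysis.

import numpy as np

rng = np.random.default_rng(1)

def ruined(u0=10.0, lam=1.0, mean_claim=1.0, loading=0.1,
           inflation=0.05, horizon=200.0):
    """Simulate one path; return True if the surplus ever goes negative.
    Premium density and claim costs both inflate at rate `inflation`."""
    t, surplus = 0.0, u0
    c0 = (1 + loading) * lam * mean_claim          # initial premium rate
    while t < horizon:
        w = rng.exponential(1 / lam)               # inter-arrival time
        # premium earned over (t, t+w) at an exponentially growing rate
        surplus += c0 * (np.exp(inflation * (t + w))
                         - np.exp(inflation * t)) / inflation
        t += w
        surplus -= rng.exponential(mean_claim) * np.exp(inflation * t)
        if surplus < 0:                            # ruin can occur only at claims
            return True
    return False

paths = 2000
print("ruin frequency:", sum(ruined() for _ in range(paths)) / paths)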
1980
Maximum likelihood estimation in the case of a Poisson or a Gamma distribution with a loglinear parameterization for the mean is quite similar. The asymptotic variance-covariance matrix for the maximum likelihood estimator is derived, as well as a linear estimator that can serve as a starting value for the nonlinear search procedure.
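A minimal sketch of the Poisson case with a log link: the likelihood equations can be solved by Newton iterations using the Fisher information, with a least-squares fit on log counts as the kind of linear starting value the abstract mentions. The details here are illustrative, not the paper's.

import numpy as np

rng = np.random.default_rng(2)

# Made-up data: Poisson counts with loglinear mean mu = exp(X beta).
X = np.column_stack([np.ones(400), rng.normal(size=400)])
y = rng.poisson(np.exp(X @ np.array([0.5, 0.8])))

# Linear starting value: least squares on log(y + 1).
beta, *_ = np.linalg.lstsq(X, np.log(y + 1.0), rcond=None)

# Newton iterations for the Poisson log-likelihood.
for _ in range(25):
    mu = np.exp(X @ beta)
    score = X.T @ (y - mu)
    fisher = X.T @ (mu[:, None] * X)       # Fisher information matrix
    step = np.linalg.solve(fisher, score)
    beta = beta + step
    if np.max(np.abs(step)) < 1e-10:
        break

# Asymptotic variance-covariance matrix: inverse Fisher information.
cov = np.linalg.inv(X.T @ (np.exp(X @ beta)[:, None] * X))
print("beta_hat:", np.round(beta, 3))
print("asymptotic covariance:\n", np.round(cov, 5))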
1980
I, too, have always been intrigued by actuarial theory put into practice to solve rating problems. Certainly the body of the Hewitt/Lefkowitz paper deals primarily with the practical manipulation of a fitted loss distribution’s cumulative function for the purposes of determining deductible discounts, increased limits factors, and relative frequency and severity.
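The kind of manipulation described, built on a fitted distribution's limited expected value E[min(X, d)], can be sketched for an assumed Lognormal severity (the Hewitt/Lefkowitz paper's actual fitted distribution and parameters may differ):

import numpy as np
from scipy.stats import norm

def lognormal_lev(d, mu, sigma):
    """Limited expected value E[min(X, d)] for a Lognormal(mu, sigma)."""
    return (np.exp(mu + sigma**2 / 2)
            * norm.cdf((np.log(d) - mu - sigma**2) / sigma)
            + d * (1 - norm.cdf((np.log(d) - mu) / sigma)))

mu, sigma = 8.0, 1.2               # assumed fitted parameters (illustrative)
mean = np.exp(mu + sigma**2 / 2)

# Deductible discount: share of expected loss eliminated below d.
d = 1_000.0
print("loss elimination ratio:", round(lognormal_lev(d, mu, sigma) / mean, 4))

# Increased limits factor relative to a basic limit (pure premium only).
basic, higher = 25_000.0, 100_000.0
print("ILF 100k/25k:", round(lognormal_lev(higher, mu, sigma)
                             / lognormal_lev(basic, mu, sigma), 3))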
1980
Mr. Khury has made a positive contribution to the expanding effort by casualty actuaries to replace over-emphasis on "seasoned judgment" with scientific method, and for this effort he must be congratulated! His paper sets forth an approach for documenting, in an explicit manner, the actuary's assumptions as to frequency, severity, inflation, payout patterns and the time value of money.
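The explicit-assumptions approach the reviewer describes can be illustrated with a toy calculation: frequency times severity, with inflation, a payout pattern, and discounting for the time value of money. All inputs below are hypothetical and the structure is only a sketch, not Mr. Khury's method.

# Toy discounted reserve from explicitly documented assumptions
# (frequency, severity, inflation, payout pattern, discount rate).
frequency = 120            # expected claim count
severity = 5_000.0         # average cost per claim at today's level
inflation = 0.06           # annual severity inflation
discount = 0.04            # annual discount rate
payout = [0.40, 0.30, 0.20, 0.10]   # share paid in years 1..4

nominal = discounted = 0.0
for year, share in enumerate(payout, start=1):
    payment = frequency * severity * share * (1 + inflation) ** year
    nominal += payment
    discounted += payment / (1 + discount) ** year

print(f"nominal reserve:    {nominal:,.0f}")
print(f"discounted reserve: {discounted:,.0f}")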
1980
Loss reserves have a significant impact on the reported operating results as well as on the financial condition of an insurer. Actuarial literature to date has focused on developing loss reserving methods [1]. The matter of assessing the condition [2] of loss reserves, on the other hand, has received relatively little attention.
1980
Mr. Van Slyke's paper presents a discussion of econometric modeling in a fairly general way. I would have preferred to see more on possible specific applications to insurance pricing, especially with regard to the more sophisticated techniques of systems dynamics and catastrophe theory.
Econometric Modeling
1980
Econometric models are widely used to forecast economic events. A number of macroeconometric models are well known, including those of the Wharton School, Chase Econometrics, Data Resources, Inc., and the Federal Reserve Bank of St. Louis. Less imposing models are cropping up in all walks of life. The Insurance Services Office (ISO) is studying the application of econometric models to actuarial problems.