1980
Reinsurance Research - General/NOC
1980
Profit Factor/Rate of Return/Risk/Solvency
1980
Mr. Habeck's timely article presents a clear view of the impact of regulation on individual health insurance practices and policies, which has heightened in recent years as a result of shortcomings in the industry, perceived or imagined, by consumer groups, legislators, and regulators.
1980
State regulation of individual health insurance has increased greatly in recent years, both in scope and intensity. The need to comply with regulations has become the dominant objective in benefit design and pricing of individual health contracts. This shift away from the dominance of market forces results from the extension of regulation to almost every aspect of the development and marketing processes.
LOB-Health, Regulation
1980
In the fourteen years since Jeffrey T. Lange wrote "General Liability Insurance Ratemaking," the insurance industry has experienced a period of significant social and economic inflation. This has been evidenced by spiraling insurance claim costs, as well as by a rapidly growing number of claims brought by an increasingly claims-conscious public.
1980
Mr. Stanard's paper opens a new area of actuarial research, namely the use of simulation to investigate the reliability of commonly used pricing (and related) models. He is not using simulation to forecast insurance results directly, but rather to determine how well a given technique for such forecasting can be expected to perform.
1980
Using an individual insured's own past loss experience to arrive at its rate is a procedure that is used in many different areas of insurance.
1980
The authors are to be commended for their willingness to address as controversial a subject as expense allocation. Their approach provides a basic introduction to the subject. This reviewer, whose experience with the subject is limited, feels that a few general comments are in order.
Ex/Ind. Risk Rating Plans
1980
Until the present time, the great majority of actuarial study and literature in the ratemaking area has revolved around analyzing and quantifying the loss component of the insurance rate. Actuaries have evolved an elaborate system in which losses are trended, developed and credibility weighted, and in which premiums are placed at current rates or at least current rate levels.
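The trend-develop-compare pipeline this abstract alludes to can be sketched as a minimal loss-ratio indication. The function name and parameter values below are illustrative assumptions, not taken from the paper:

```python
def indicated_rate_change(incurred_losses, ldf, trend_factor,
                          onlevel_premium, permissible_loss_ratio):
    """Loss-ratio method sketch: develop losses to ultimate, trend them
    to the future policy period, and compare the projected loss ratio
    against the permissible loss ratio."""
    projected_losses = incurred_losses * ldf * trend_factor
    projected_loss_ratio = projected_losses / onlevel_premium
    # indicated change: +x means rates should rise by x
    return projected_loss_ratio / permissible_loss_ratio - 1.0
```

For example, $600 of incurred losses with a 1.10 development factor and a 1.05 trend factor against $1,000 of on-level premium and a 70% permissible loss ratio indicate a rate change of about -1%.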
1980
It is often necessary to estimate probability distributions to describe the loss processes covered by insurance contracts. For example, in order that the premium charged for a particular contract be correct according to any reasonable premium calculation principle, it must be based upon the underlying loss process for the contract.
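One common way to estimate such a loss distribution is maximum likelihood under a parametric severity model; as a hedged illustration (the lognormal choice and function name are assumptions, not from the paper), the lognormal MLE has a closed form:

```python
import math

def fit_lognormal_mle(losses):
    """Closed-form maximum-likelihood fit of a lognormal severity model:
    mu and sigma are the mean and (population) standard deviation of the
    logged losses."""
    logs = [math.log(x) for x in losses]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / n)
    return mu, sigma
```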
1980
It is demonstrated that the problems of balancing a reinsurance network and finding the maximum flow in a graph are identical. Gale's theorem is applied first in order to prove a conjecture of Sousselier concerning simple first order networks, next to extend those results to any network. The balanced reinsurance scheme can effectively be constructed by means of Ford and Fulkerson's algorithm, as is shown by an example.
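The abstract's max-flow construction can be illustrated with a small sketch. This uses the BFS variant of Ford and Fulkerson's algorithm (Edmonds-Karp); the dictionary encoding of the network and the idea of reading capacities as treaty limits are illustrative assumptions:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp maximum flow. `capacity` maps node -> {node: capacity};
    for a reinsurance network, capacities could encode treaty limits."""
    # build residual graph, adding zero-capacity reverse edges
    residual = {u: dict(edges) for u, edges in capacity.items()}
    for u, edges in capacity.items():
        for v in edges:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow                      # no augmenting path: done
        # recover the path and its bottleneck capacity
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        # push flow along the path, updating residual capacities
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
```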
1980
Aggregate loss probability is an effective tool in actuarial rate making, risk charging, and retention analysis for both primary and secondary insurance companies. A noticeable trend over recent years indicates that it also is becoming an indispensable element in the risk management operations of many manufacturing and commercial firms.
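A minimal Monte Carlo sketch of an aggregate loss distribution, assuming a compound Poisson model with exponential severities (both distributional choices, and all names here, are illustrative rather than from the paper):

```python
import math
import random

def aggregate_loss_samples(claim_rate, mean_severity, n_sims, seed=0):
    """Simulate aggregate annual losses: a Poisson claim count compounded
    with independent exponential claim severities."""
    rng = random.Random(seed)
    threshold = math.exp(-claim_rate)
    totals = []
    for _ in range(n_sims):
        # Knuth's method for a Poisson(claim_rate) claim count
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        totals.append(sum(rng.expovariate(1.0 / mean_severity)
                          for _ in range(k)))
    return totals
```

The empirical distribution supports retention analysis directly, e.g. `sum(t > retention for t in totals) / len(totals)` estimates the probability that aggregate losses pierce a retention.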
1980
The purpose of this paper is to add another chapter to the fund of knowledge being accumulated on loss reserving techniques.
1980
Much work has been done in the past few years on applications of Bayesian credibility to insurance pricing. This work has been born of necessity, owing to the failure of "classical" credibility theories. Recent work by Bühlmann and Straub, as well as by Morris and Van Slyke, incorporates the Bayesian concept of utilizing as much of the information available from historical data as possible in predicting behavior for a segment of a population.
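The core credibility mechanics behind this line of work can be sketched with the plain Bühlmann model (a simplification of Bühlmann-Straub, which allows unequal exposure weights); the function name and empirical estimators used below are a standard textbook construction, not necessarily those of the papers cited:

```python
def buhlmann_premiums(data):
    """Empirical Buhlmann credibility premiums for equally long risk
    histories. `data` is a list of per-risk loss lists, each of length n."""
    r = len(data)                     # number of risks
    n = len(data[0])                  # observations per risk
    risk_means = [sum(xs) / n for xs in data]
    grand_mean = sum(risk_means) / r
    # expected process variance: average within-risk sample variance
    epv = sum(sum((x - m) ** 2 for x in xs) / (n - 1)
              for xs, m in zip(data, risk_means)) / r
    # variance of hypothetical means, with the usual bias correction
    vhm = sum((m - grand_mean) ** 2 for m in risk_means) / (r - 1) - epv / n
    vhm = max(vhm, 0.0)
    k = epv / vhm if vhm > 0 else float("inf")
    z = n / (n + k)                   # credibility factor Z = n / (n + K)
    # credibility-weighted premium for each risk
    return [z * m + (1 - z) * grand_mean for m in risk_means]
```

When risks differ sharply but each history is internally stable, Z approaches 1 and each risk gets its own mean; when risks look alike, Z collapses to 0 and every risk gets the grand mean.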
1980
An important fact brought into sharp focus by the papers submitted to last year's program is that a healthy insurance enterprise not only must produce adequate earnings but must do so steadily and predictably. That is to say: the risks that confront the enterprise must be tightly controlled. Management must steer a course mindful not only of reasonable expectations but also of unforeseen deviations therefrom.
1980
"Analytical steps towards a numerical calculation of the ruin probability for a finite period when the risk process is of the Poisson type or of the more general type studied by Sparre Andersen," by Olof Thorin, Stockholm (Astin Bulletin, vol. 6, pp. 54-65).
On pp. 56-57 of the above-mentioned paper there is a digression about K(t), the distribution function for the time between successive claims.
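The paper pursues analytical and numerical methods; purely as an illustration of the finite-period ruin problem it addresses, here is a Monte Carlo sketch for a compound Poisson surplus process. The exponential severity choice and all names are assumptions of this sketch, not Thorin's method:

```python
import random

def finite_horizon_ruin_prob(initial_surplus, premium_rate, claim_rate,
                             mean_severity, horizon, n_sims, seed=0):
    """Estimate P(ruin before `horizon`) for a surplus process
    U(t) = u0 + c*t - S(t), with S(t) compound Poisson and
    exponential claim severities. Ruin can occur only at claim epochs."""
    rng = random.Random(seed)
    ruins = 0
    for _ in range(n_sims):
        t, surplus = 0.0, initial_surplus
        while True:
            wait = rng.expovariate(claim_rate)   # time to next claim
            t += wait
            if t > horizon:
                break                            # survived the horizon
            surplus += premium_rate * wait       # premium income accrued
            surplus -= rng.expovariate(1.0 / mean_severity)
            if surplus < 0.0:
                ruins += 1
                break
    return ruins / n_sims
```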
1980
Reinsurance Research - Reserving
1980
As most actuaries that have had an opportunity to prepare a rate filing will tell you, the ratemaker will generally have to convince three principals that the rate he is generating is a reasonable one. First, he must convince himself. This first step alone is, in many cases, a difficult and laborious task due to the many technical uncertainties with which an actuary must deal.
1980
A recent article in the Journal of Commerce cited an address given at the convention of The National Association of Casualty & Surety Agents by its President, Mr. John S. Childross, Vice President of Marsh & McLennan. His remarks emphasized the value of educating the public regarding casualty ratemaking procedures and company needs.
General/Premium Analysis/Regulation
1980
Insurance pricing is a combination of rating classification and underwriting selection. The author defines necessary and sufficient standards for rating classification which are not met by underwriting selection or insurance pricing as a whole. It is unreasonable to assert that it is necessary for rating classifications to meet certain standards when the pricing structure as a whole does not meet those same standards.
1980
The escalating inflation of the past decade spawned complaints about more than just overall insurance rate increases. Unlike most other products, insurance costs depend upon buyer characteristics, so questions of fairness have naturally arisen as some insureds were confronted with four-digit auto insurance prices along with double-digit inflation.
1980
This paper examines the economic consequences of allocating common costs by (1) gross revenues, (2) directly attributable costs, and (3) relative output levels (such as ton-miles) to determine fully distributed cost prices for regulated firms.
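Each of the three allocation bases the paper compares reduces to the same pro-rata mechanics over a different basis; a minimal sketch (function and key names are illustrative):

```python
def allocate_common_costs(common_cost, basis):
    """Pro-rata allocation of a common cost over any chosen basis:
    gross revenues, directly attributable costs, or relative output
    levels (e.g. ton-miles). Returns each segment's share."""
    total = sum(basis.values())
    return {k: common_cost * v / total for k, v in basis.items()}
```

Running the same common cost through revenue, attributable-cost, and output bases generally yields different fully distributed cost prices, which is the economic point at issue.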
1980
A premium calculation principle is a functional assigning to a random variable (or its distribution function) a real number. The interpretation is rather obvious: the random variable stands for the possible claims of a risk, whereas the resulting number is the premium charged for assuming this risk. Of course, in economics premiums depend not only on the risk but also on market conditions.
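Three classical examples of such functionals, computed here from empirical claim samples for concreteness (the loading parameters are illustrative assumptions):

```python
from statistics import mean, pstdev, pvariance

def expected_value_principle(claims, loading=0.1):
    """Expected value principle: H[X] = (1 + theta) * E[X]."""
    return (1.0 + loading) * mean(claims)

def variance_principle(claims, alpha=0.01):
    """Variance principle: H[X] = E[X] + alpha * Var[X]."""
    return mean(claims) + alpha * pvariance(claims)

def std_dev_principle(claims, beta=0.5):
    """Standard deviation principle: H[X] = E[X] + beta * SD[X]."""
    return mean(claims) + beta * pstdev(claims)
```

Each maps a (sampled) claim distribution to a single real premium, with the loading term rewarding the insurer for bearing the risk.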
1980
This paper presents an experimental investigation of risk taking in the domain of losses. The results are partly compatible with expected utility theory, assuming an inflection point in the utility function over losses. However, overweighting of low probabilities and underweighting of high ones were observed, which runs counter to the expected utility model.