
GLMs - prior weights, single risk premium model

Discussion in 'SP8' started by howard, Sep 17, 2019.

  1. howard

    howard Active Member

    Hi all
    1) The GLMs chapter mentions prior weights (e.g. 1 for claim numbers, exposure for claim frequency).
    What is the meaning of prior weights and how are they used in modelling?
    2) What is a 'single risk premium model'?
    Many thanks
     
  2. Katherine Young

    Katherine Young ActEd Tutor Staff Member

    1) Imagine you have 10 policyholders aged over 90 but 10,000 policyholders aged between 20 and 30. You want your model output to be influenced more by those 10,000 policies than by the 10 policies. The prior weights allow you to do this.
    2) You would combine your frequency and severity models to get an estimated risk premium. (In other words, frequency × severity = risk premium, or burning cost.)
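
    As an illustration of both points, here is a minimal sketch in Python using statsmodels, with simulated data. The column names, rating factor and parameter values are purely made up, and statsmodels' var_weights argument is used here to play the role of the prior weight; treat it as a sketch of the idea rather than a specification from the Core Reading.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated policy-level data (purely illustrative)
rng = np.random.default_rng(42)
n = 5_000
df = pd.DataFrame({
    "age_band": rng.choice(["20-30", "30-60", "60-90", "90+"],
                           size=n, p=[0.55, 0.30, 0.10, 0.05]),
    "exposure": rng.uniform(1 / 12, 1.0, size=n),  # years on risk
})
df["claim_count"] = rng.poisson(0.2 * df["exposure"])
df["avg_claim_size"] = np.where(df["claim_count"] > 0,
                                rng.gamma(shape=2.0, scale=500.0, size=n),
                                np.nan)

# 1) Frequency model: claims per unit exposure, with exposure as the
#    prior weight (var_weights). An observation backed by lots of
#    exposure is treated as having lower variance, so it pulls the
#    fitted values more than a thinly exposed one.
df["claim_freq"] = df["claim_count"] / df["exposure"]
freq_model = smf.glm("claim_freq ~ age_band", data=df,
                     family=sm.families.Poisson(),
                     var_weights=np.asarray(df["exposure"])).fit()

# 2) Severity model: average claim size, weighted by the number of
#    claims underlying each average (the usual prior weight here).
claims = df[df["claim_count"] > 0]
sev_model = smf.glm("avg_claim_size ~ age_band", data=claims,
                    family=sm.families.Gamma(link=sm.families.links.Log()),
                    var_weights=np.asarray(claims["claim_count"])).fit()

# Single risk premium model: expected frequency x expected severity
# = estimated risk premium (burning cost) per unit of exposure.
df["risk_premium"] = freq_model.predict(df) * sev_model.predict(df)
print(df.groupby("age_band")["risk_premium"].mean())
```

    Multiplying risk_premium by exposure would then give an expected claims cost per policy rather than per unit of exposure.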
     
  3. howard

    howard Active Member

    Re your answer to question 1, what are you measuring in your scenario? How would you use weights to do this?
    Many thanks
     
  4. Katherine Young

    Katherine Young ActEd Tutor Staff Member

    The prior weights allow information about the known credibility of each observation to be incorporated in the model. For example, if modelling claim frequency, one observation might relate to one month's exposure and another to one year's exposure. There is more information, and less variability, in the observation relating to the longer exposure period, and this can be incorporated in the model by defining the prior weight ωi of each observation to be its exposure. In this way observations with higher exposure are deemed to have lower variance, and the model will consequently be more influenced by these observations.
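
    In standard GLM notation (the convention used in the paper linked below), the prior weight enters through the variance assumption:

    Var(Yi) = φ V(μi) / ωi

    where φ is the scale parameter and V(.) is the variance function. Setting ωi equal to the exposure means that doubling the exposure behind an observation halves its assumed variance, which is why the fit leans more heavily on the well-exposed observations.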

    This is quite technical, Howard. For more information, see the original source of the Core Reading: https://www.casact.org/pubs/dpp/dpp04/04dpp1.pdf
     
