log likelihoods

Discussion in 'CT6' started by withoutapaddle, May 22, 2006.

  1. Does anyone know about these log likelihood things in Chapter 10? It goes on about
    E[(ln L)'] = 0 and E[(ln L)''] = -E[((ln L)')^2]

    What does this mean? I know the Cramér-Rao lower bound is connected to these things, and so is the Fisher information matrix, and I've heard somewhere that entropy/surprise/information theory is also related. (A sketch of where the identities come from is at the end of this post.)

    The chapter quotes some book to do with generalised linear models (GLMs). Has anyone read it, and will it answer my questions?

    I feel like I definitely have to go beyond the Core Reading on GLMs, because I'm really not sure about deciding between various models or assessing the significance of parameters (see the sketch below). Any suggestions for good books?
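    A sketch of where the two identities come from (this is standard likelihood theory, assuming the usual regularity conditions that let you differentiate under the integral sign): write $\ell(\theta) = \ln L(\theta)$ for a single observation with density $f(x;\theta)$. Since $\int f(x;\theta)\,dx = 1$ for every $\theta$,

    \[ 0 = \frac{\partial}{\partial\theta}\int f\,dx = \int \frac{\partial \ln f}{\partial\theta}\,f\,dx = E\left[\ell'(\theta)\right]. \]

    Differentiating once more under the integral sign, and using $\frac{\partial}{\partial\theta}(\ell' f) = \left(\ell'' + (\ell')^2\right) f$,

    \[ 0 = E\left[\ell''\right] + E\left[(\ell')^2\right], \quad\text{i.e.}\quad E\left[\ell''\right] = -E\left[(\ell')^2\right]. \]

    The common value $I(\theta) = E[(\ell')^2] = -E[\ell'']$ is the Fisher information, and the Cramér-Rao lower bound says $\operatorname{Var}(\hat\theta) \ge 1/I(\theta)$ for any unbiased estimator $\hat\theta$. As for the entropy connection: the Kullback-Leibler divergence between $f(\,\cdot\,;\theta)$ and $f(\,\cdot\,;\theta+\delta)$ is approximately $I(\theta)\delta^2/2$ for small $\delta$, so Fisher information measures how quickly the distribution becomes "surprising" as the parameter moves.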
     
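    On the model-choice question, a minimal sketch in Python (not from the Core Reading; the data, variable names, and Poisson/log-link choice are made up for illustration) of the usual GLM toolkit: compare nested models by the drop in deviance, compare any two models by AIC, and judge individual parameters by Wald statistics (estimate / standard error):

      import numpy as np
      import statsmodels.api as sm
      from scipy import stats

      # Simulated data: the response truly depends on x1 only; x2 is noise.
      rng = np.random.default_rng(0)
      n = 500
      x1, x2 = rng.normal(size=n), rng.normal(size=n)
      y = rng.poisson(np.exp(0.3 + 0.8 * x1))

      X_small = sm.add_constant(x1)                        # intercept + x1
      X_big = sm.add_constant(np.column_stack([x1, x2]))   # adds the noise term

      fit_small = sm.GLM(y, X_small, family=sm.families.Poisson()).fit()
      fit_big = sm.GLM(y, X_big, family=sm.families.Poisson()).fit()

      # Nested models: the drop in (scaled) deviance is approximately
      # chi-squared, df = number of extra parameters, if the smaller
      # model is adequate. A large p-value says keep the smaller model.
      drop = fit_small.deviance - fit_big.deviance
      print("deviance drop:", drop, " p-value:", stats.chi2.sf(drop, df=1))

      # AIC penalises extra parameters; smaller is better.
      print("AIC:", fit_small.aic, fit_big.aic)

      # Wald z-statistics for each parameter; |z| > 1.96 is the usual
      # rough cut for significance at the 5% level.
      print("z:", fit_big.params / fit_big.bse)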
  2. 89 views and no replies - lame!
    Surely someone has taken stats before, i.e. Fisher information.
    Or physics, i.e. entropy and Boltzmann's law.
    I took pure math, so I don't know anything :D
     
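    Since Fisher information came up: a quick Monte Carlo check of the two identities for a single Poisson(lam) observation, where ln L = -lam + x*ln(lam) - ln(x!), so (ln L)' = x/lam - 1 and (ln L)'' = -x/lam^2 (lam = 2.5 is an arbitrary choice):

      import numpy as np

      rng = np.random.default_rng(42)
      lam = 2.5
      x = rng.poisson(lam, size=1_000_000)

      score = x / lam - 1.0       # (ln L)'  at the true parameter
      curvature = -x / lam**2     # (ln L)''

      print(np.mean(score))       # ~ 0.0:   E[(ln L)'] = 0
      print(np.mean(curvature))   # ~ -0.4:  E[(ln L)''] = -1/lam
      print(-np.mean(score**2))   # ~ -0.4: -E[((ln L)')^2] matches

    The common magnitude 1/lam is the Fisher information of one observation, so the Cramér-Rao bound for an unbiased estimator of lam from a single observation is Var >= lam.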
  3. Dha

    I was rather clueless about GLMs when I was doing CT6; they were by far the hardest part of the course. But the ActEd tutorial really saved me. They're not that hard once someone explains them to you. Ironically, I think it was the GLM question that helped me scrape a pass.

    So basically, I can't answer your question, but I wouldn't worry about that chapter. Just book a tutorial!
     
