
Time Series - Chapter 13: "Autoregressive Model more convenient than Moving Average"

Discussion in 'CS2' started by Bill SD, Apr 5, 2022.

  1. Bill SD

    Bill SD Very Active Member

    Hi,
    Quick question: The Core Reading for Time Series (Chapter 13, pg 37 in 2019 version) says: "In many circumstances an autoregressive model is more convenient than a moving average model."

    Why is this the case, and in what circumstances would it apply? Is it because an autoregressive process only has one white noise/error term and will always be invertible (which is good for statistical packages)? Or is it because it is more common for a time series to have the features of an autoregressive process, ie an ACF which decays geometrically and a PACF which cuts off for k > p?
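
    For concreteness, these are the standard AR(1)/MA(1) results, with \( \rho_k \) denoting the ACF and \( \phi_k \) the PACF (my notation, not necessarily the Core Reading's):

    \( \text{AR(1): } \rho_k = \alpha^k \text{ (geometric decay)}, \qquad \phi_k = 0 \text{ for } k > 1 \)

    \( \text{MA(1): } \rho_1 = \frac{\beta}{1+\beta^2}, \quad \rho_k = 0 \text{ for } k > 1, \qquad \phi_k \text{ decays (roughly) geometrically} \)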

    Thanks in advance!
     
  2. John Lee

    John Lee ActEd Tutor Staff Member

    I believe, though I could be wrong, that it is more convenient because it depends on past observable values - whereas the MA doesn't have that explicit connection.
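
    As a rough sketch of what that looks like in practice (AR(1) vs MA(1) with the usual notation, not taken from the Core Reading), the one-step forecasts are:

    \( \text{AR(1): } \hat{X}_{t+1} = \mu + \alpha (X_t - \mu) \)

    \( \text{MA(1): } \hat{X}_{t+1} = \mu + \beta e_t, \quad \text{where } e_t = (X_t - \mu) - \beta e_{t-1} \)

    The AR forecast uses the latest observed value directly, whereas the MA forecast needs the unobserved error \( e_t \), which has to be reconstructed recursively from the whole past (and that is only possible if the process is invertible).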
     
  3. Bill SD

    Bill SD Very Active Member

    Thanks John - why does a dependence on past values make it 'convenient'? Because it's easier to calculate and update over time?

    Appreciate you didn't write it, and I suppose it's not so relevant for exams (so less urgent than other people's questions).
     
    Last edited: Apr 11, 2022
  4. Sunil Chaudhary

    Sunil Chaudhary Active Member

    Hi,
    For the autoregressive model, can you please explain what \( \mu \) stands for?
    Also, for AR(p), in the proof of Result 13.2 on page 29, the result for the autocovariance function is shown for \( k \geq p \). Is there any reason for this condition?
    Thanks.
    Sunil
     
  5. Sunil Chaudhary

    Sunil Chaudhary Active Member

    Hi,
    Can anyone please comment on the above?

    Thanks.
     
  6. Andrew Martin

    Andrew Martin ActEd Tutor Staff Member

    Hi Sunil

    If we have an AR(p) process written as:

    \( X_t = \mu + \alpha_1 (X_{t-1} - \mu) + \dots + \alpha_p (X_{t-p} - \mu) + e_t \)

    Then \( \mu \) is the mean of the process.
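
    One quick way to see this: take expectations of both sides and assume (weak) stationarity, so that \( E[X_t] = E[X_{t-1}] = \dots = m \), say. Since \( E[e_t] = 0 \):

    \( m = \mu + (\alpha_1 + \dots + \alpha_p)(m - \mu) \quad \Rightarrow \quad (m - \mu)(1 - \alpha_1 - \dots - \alpha_p) = 0 \)

    and \( \alpha_1 + \dots + \alpha_p \neq 1 \) for a stationary process, so \( m = \mu \).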

    Regarding page 29, although technically the result also holds for \( k < p \), the reason we consider \( k \geq p \) is to get the structure of a \( p \)th-order difference equation, ie of the form:

    \( y_p = a_1 y_{p-1} + a_2 y_{p-2} + \dots + a_p y_0 \)
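
    For the AR(p) autocovariances themselves the corresponding equation is (the standard Yule-Walker step, sketched here rather than quoted from page 29):

    \( \gamma_k = \alpha_1 \gamma_{k-1} + \alpha_2 \gamma_{k-2} + \dots + \alpha_p \gamma_{k-p} \)

    For \( k \geq p \) every index \( k-1, \dots, k-p \) is non-negative, so this is a genuine \( p \)th-order linear difference equation in \( \gamma_k \). For \( k < p \) some of the indices go negative and we would have to fall back on the symmetry \( \gamma_{-j} = \gamma_j \), so the clean recurrence structure above is lost.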

    Hope this helps!

    Andy
     
  7. Sunil Chaudhary

    Sunil Chaudhary Active Member

    Thanks Andy.

    Sunil
     
