Hello
Because the chain is Markov, the future transition probabilities depend only on the current state, not on the past of the process. That isn't to say we're not concerned with the past at all: to work out the probability of a particular series of jumps occurring, we have to consider every jump, not just the ones at the end.
Although the solutions don't necessarily present it in this manner, we do consider the full path of the process for k=3,4,5 and 6. For example, when k=3, the possibilities are:
LLL
The probability of this is 0.2^3.
When k = 4, the possibilities are:
WLLL
The probability of this is 0.8 * 0.2^3.
When k = 5, the possibilities are:
WWLLL
LWLLL
The probabilities of these are 0.8^2 * 0.2^3 and 0.8 * 0.2^4 respectively. The sum of these is:
0.8^2 * 0.2^3 + 0.8 * 0.2^4 = 0.8 * 0.2^3 * (0.8 + 0.2) = 0.8 * 0.2^3
This happens to be the probability of WLLL (and hence possibly the confusion). This is because although we need the path to end in WLLL, for k = 5 it doesn't matter what comes before it - we can have either W or L as the first outcome.
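If you want to double-check the k = 5 arithmetic, here's a quick sketch in Python (the variable names are mine; 0.8 and 0.2 are the win/loss probabilities from the problem):

```python
p_w, p_l = 0.8, 0.2  # win and loss probabilities from the problem

# The two length-5 paths that end with the coach fired at week 5
p_wwlll = p_w**2 * p_l**3  # WWLLL
p_lwlll = p_w * p_l**4     # LWLLL

# Their sum collapses to the probability of WLLL alone
assert abs((p_wwlll + p_lwlll) - p_w * p_l**3) < 1e-12
```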
For k = 6, the possibilities are:
WWWLLL
WLWLLL
LWWLLL
LLWLLL
Again, although we need WLLL at the end of 6 games, the first two results don't matter (it can be any of the four possible combinations of outcomes for two games). So the probability is again just the probability of WLLL. You can check this using the actual numbers like I did for k=5 and adding up all the probabilities.
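Doing that check in a few lines of Python, summing the four paths listed above:

```python
p_w, p_l = 0.8, 0.2  # win and loss probabilities from the problem

# The four length-6 paths in which the coach is fired exactly at week 6
paths = ["WWWLLL", "WLWLLL", "LWWLLL", "LLWLLL"]
total = sum(p_w ** s.count("W") * p_l ** s.count("L") for s in paths)

# Again the total is just the probability of WLLL
assert abs(total - p_w * p_l**3) < 1e-12
```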
Now, for k = 7 we need WLLL at the end of 7 games. So we have three results to think about before the WLLL. However, this time the results before WLLL DO matter. In particular, we can't have LLLWLLL, as this would mean the coach was fired after 3 weeks, not after 7 weeks. So we need to take this into account.
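More generally, for small k you can brute-force the whole thing: enumerate every win/loss sequence of length k and keep only those in which the first run of three straight losses ends exactly at game k. A rough sketch (the function name is mine):

```python
from itertools import product

def prob_fired_at(k, p_w=0.8, p_l=0.2):
    """Probability that the first run of three consecutive losses
    ends exactly at game k, by brute-force enumeration."""
    total = 0.0
    for seq in product("WL", repeat=k):
        s = "".join(seq)
        # The earliest LLL must finish exactly at game k;
        # find() returns -1 if there is no LLL at all.
        if s.find("LLL") == k - 3:
            total += p_w ** s.count("W") * p_l ** s.count("L")
    return total
```

For k = 4, 5 and 6 this returns 0.8 * 0.2^3, as argued above, while for k = 7 it comes out smaller by exactly the probability of LLLWLLL, the excluded path.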
Hope this helps
Andy