The difference is about 1%, which may be insignificant, but I'm just wondering why the two methods produced slightly different numbers, when doubling the initial rates is the same as doubling the deaths.
To answer your question, doubling the deaths is not the same as doubling the initial rates. This is because if you increase the death rates, you leave fewer people to die at later ages. I've said this before and I'm not sure how else to say it, so I'll try an example.
Say you have 1,000,000 people aged 65 exact and the base mortality rate is 1%.
At age 66 you are left with 990,000 people, i.e. 10,000 deaths.
At age 67 you are left with 980,100 people, i.e. 9,900 deaths.
Now double the mortality rate to 2%:
At age 66 you are left with 980,000 people, i.e. 20,000 deaths.
At age 67 you are left with 960,400 people, i.e. 19,600 deaths.
The second-year figure under doubled rates (19,600) is not twice the figure under standard rates (2 × 9,900 = 19,800).
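The arithmetic above can be checked with a short Python sketch (the function name `deaths_by_age` and the flat rate are just for illustration):

```python
def deaths_by_age(pop, qx, years):
    """Deaths in each year for a cohort with a flat annual mortality rate qx."""
    deaths = []
    for _ in range(years):
        d = pop * qx       # deaths this year
        deaths.append(d)
        pop -= d           # survivors carried into the next year
    return deaths

base = deaths_by_age(1_000_000, 0.01, 2)     # [10000.0, 9900.0]
doubled = deaths_by_age(1_000_000, 0.02, 2)  # [20000.0, 19600.0]

print(doubled[0] / base[0])  # first-year deaths double exactly: 2.0
print(doubled[1] / base[1])  # second-year deaths fall short of double
```

The second ratio comes out at roughly 1.98 rather than 2, a shortfall of about 1%, which matches the size of the discrepancy in the original question.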
Put another way, you are increasing the chance of dying now, which means that you are decreasing the chance of dying later (the chances of dying at each age sum to 1). I.e., you can't simply increase deaths at all ages across the board.
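The "sums to 1" point can also be illustrated numerically: however much you scale the rates, cumulative deaths over a long enough horizon converge on the whole starting cohort, so deaths at every age cannot all double. A sketch (the helper name is hypothetical):

```python
def cumulative_deaths(pop, qx, years):
    """Total deaths over `years` for a cohort with flat annual mortality rate qx."""
    total = 0.0
    for _ in range(years):
        d = pop * qx
        total += d
        pop -= d
    return total

# Over a long horizon both schedules approach the same total (the whole cohort),
# so the doubled rates merely shift deaths to earlier ages.
print(cumulative_deaths(1_000_000, 0.01, 1000))
print(cumulative_deaths(1_000_000, 0.02, 1000))
```

Both totals come out just under 1,000,000; the doubled-rate schedule gets there faster, which is exactly why its later-age deaths are lighter.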
Devon, you seem to understand the finer issues. It is exactly right for one period, and I agree that it is a reasonably close approximation (it turns out to be 1% in this case). However, you should never make an approximation, regardless of how good it is, without realising that you are making one and appreciating its impact. In exams (and in real life) you should always note the approximations you make, and there are often marks for this.