Hmmm - convincing set of numbers here.
Let's say a project has $100 allocated to it, with net profits of $10 and a required rate of return of 10%, so 100 = 10/0.1.
OK.
If expected losses were to increase by z,
This is where I have trouble. If expected losses increase, it's not a riskier project, just a worse one. Similarly, if costs or taxes increase, the project is simply worse, and the sponsor would pay a lower price for that project and its cashflows.
A riskier project is one that offers a HIGHER expected return, but comes with associated risks and volatility.
a) net profit = revenue - cost - expected losses
b) the project is more risky as a result, and the required return increases by y
New capital allocated = (10 - z)/(0.1 + y), which would certainly be less than $100 since both z and y are positive. So increased risk implies that less capital should be allocated.
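To make the arithmetic concrete, here's a quick Python sketch of the capitalisation formula above. The specific values z = 1 and y = 0.02 are just illustrative, not from the thread:

```python
def capital_allocated(net_profit, required_return):
    """Capital as capitalised net profit: C = P / r."""
    return net_profit / required_return

base = capital_allocated(10, 0.10)  # the original project: 100.0

# illustrative stress: z = 1 of extra expected losses,
# y = 0.02 of extra required return
stressed = capital_allocated(10 - 1, 0.10 + 0.02)  # 9/0.12 = 75.0

print(base, stressed)  # 100.0 75.0
```

Any positive z shrinks the numerator and any positive y grows the denominator, so the stressed figure is always below the base one, which is exactly the puzzle.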
Where have I gone wrong?
My numbers would be: the new project has higher risk, but will generate $12 of profits on average. If the higher risk means that the required rate of return according to CAPM is 12% rather than 10%, then the capital required is 12/0.12 = $100, as before.
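The point being made here is that risk moves the numerator and the denominator together, so the capitalised value can stay put. A minimal check of that arithmetic:

```python
def capital_allocated(net_profit, required_return):
    """Capital as capitalised net profit: C = P / r."""
    return net_profit / required_return

# riskier project: higher expected profit AND a higher CAPM required return
original = capital_allocated(10, 0.10)  # 100.0
riskier = capital_allocated(12, 0.12)   # 100.0 - same capital as before

print(original, riskier)  # 100.0 100.0
```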
The other method is to model one standard deviation of profits and call that the EAR. Then the capital required is EAR / WACC or EAR / risk-free rate (different companies choose different conventions; so long as they are applied consistently across the institution, they add something to the process). A riskier project would then result in a higher EAR and hence more capital.
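A sketch of that second method, with made-up profit scenarios purely for illustration. Here "EAR" is taken to be one sample standard deviation of the profit scenarios, per the description above, and WACC is assumed to be 10%:

```python
import statistics

def capital_from_ear(profit_scenarios, discount_rate):
    """Capital = EAR / rate, where EAR is taken as one standard
    deviation of the profit scenarios."""
    ear = statistics.stdev(profit_scenarios)
    return ear / discount_rate

# hypothetical scenarios: same mean-ish profit, very different spread
low_risk = [9, 10, 11]    # sample stdev = 1.0
high_risk = [4, 12, 20]   # sample stdev = 8.0
wacc = 0.10

print(capital_from_ear(low_risk, wacc))   # 10.0
print(capital_from_ear(high_risk, wacc))  # 80.0
```

Under this convention, extra volatility feeds straight into EAR and hence into the capital figure, which is the behaviour the original poster was expecting.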