In the constant-growth model, we can apply the equation P = D/(r − g) only under the assumption that r > g. Suppose someone argues that for a certain stock, r < g forever, not just during a temporary growth spurt. Why can’t this be the case? What would happen to the stock price if it were true? If you try to answer simply by looking at the formula, you will almost certainly get the wrong answer. Think it through.
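A minimal numerical sketch (with hypothetical dividend and rate values) can make the issue concrete. It sums the discounted dividends D·(1 + g)^(t−1)/(1 + r)^t over a finite horizon: when r > g the partial sums settle toward D/(r − g), but when r < g each additional year contributes a *larger* discounted term, so the sum grows without bound as the horizon lengthens rather than converging to any price.

```python
def pv_of_dividends(d1, r, g, years):
    """Present value of a dividend stream growing at rate g,
    discounted at rate r, summed over a finite horizon."""
    return sum(d1 * (1 + g) ** (t - 1) / (1 + r) ** t
               for t in range(1, years + 1))

# Case 1: r > g -- the series converges toward D / (r - g).
print(pv_of_dividends(1.00, 0.10, 0.05, 500))   # approaches 1 / 0.05 = 20

# Case 2: r < g -- lengthening the horizon keeps raising the "price",
# so the infinite-horizon value is unbounded, not negative.
for years in (50, 100, 200):
    print(years, round(pv_of_dividends(1.00, 0.05, 0.10, years), 2))
```

Note that plugging r < g into the formula mechanically yields a negative price, which is the wrong answer the question warns about; the series itself shows the value actually diverges to infinity.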