Question:

Consider the following two choices:

1. You have to choose between losing \(\$ 7400\) for sure, and a risky alternative, whereby you lose \(\$ 10,000\) with probability 0.75 and nothing with probability 0.25. What do you prefer?

2. You may take a gamble whereby you win \(\$ 7400\) with probability 0.25, and you lose \(\$ 2600\) with probability 0.75. Do you accept?

Most people take the risky alternative in the first case, but they do not in the second one. Note that, in the second case, the expected payoff of the gamble is

\[
0.25 \times \$ 7400 - 0.75 \times \$ 2600 = -\$ 100.
\]

Since we expect to lose \(\$ 100\), any risk-averse or risk-neutral decision maker would decline the gamble. However, in the first case, the expected loss of the risky alternative is \(0.75 \times \$ 10,000 = \$ 7500\), which exceeds the sure loss by \(\$ 100\). If we regard the sure loss of \(\$ 7400\) as a sunk cost, taking the risky alternative in the first choice is equivalent to adding the gamble of the second choice to the sure loss: relative to \(-\$ 7400\), the risky alternative loses an extra \(\$ 2600\) with probability 0.75 or recovers \(\$ 7400\) with probability 0.25, which is exactly the gamble in the second choice. Hence, it seems that, when losses are involved, we may behave as risk lovers, contradicting the idea of concave utility functions.
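The arithmetic above can be verified with a minimal sketch (the payoffs and probabilities are the ones stated in the question; the helper `expected_value` is just for illustration):

```python
def expected_value(outcomes):
    """Expected value of a lottery given as (payoff, probability) pairs."""
    return sum(p * x for x, p in outcomes)

# Choice 1: sure loss of $7400 vs. a risky alternative.
sure_loss = -7400
risky = expected_value([(-10_000, 0.75), (0, 0.25)])

# Choice 2: the stand-alone gamble.
gamble = expected_value([(7400, 0.25), (-2600, 0.75)])

print(risky)              # expected loss of the risky alternative: -7500.0
print(risky - sure_loss)  # risky alternative is worse by $100 in expectation: -100.0
print(gamble)             # expected payoff of the gamble: -100.0
```

Note that `risky - sure_loss` equals `gamble`: net of the sunk cost of \(\$ 7400\), the risky alternative in the first choice and the gamble in the second choice are the same lottery.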
