18  Risk

For more information on these topics, see Allen, Doherty, Weigelt, and Mansfield Chapter 14: Risk Analysis.

This chapter introduces fundamental concepts for analyzing risk and uncertainty in managerial decision making. The key ideas:

18.1 Probability and Expected Value

Managers often need to make decisions in uncertain environments where the outcomes are not known with certainty. Probability provides a way to quantify the likelihood of different possible outcomes. The probability of an event is defined as the proportion of times it would occur if the situation were repeated many times under similar conditions.

For example, if a company launches a new product, there may be a 60% chance it succeeds and a 40% chance it fails based on market research and historical data. We can represent this as:

Outcomes   Probabilities
Success    0.6
Failure    0.4
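The frequency interpretation of probability can be checked with a short simulation. This sketch simulates many launches of the product above (success probability 0.6) and shows that the observed success proportion settles near 0.6:

```r
set.seed(42)

# Simulate 10,000 product launches, each succeeding with probability 0.6
launches <- sample(c("Success", "Failure"), size = 10000,
                   replace = TRUE, prob = c(0.6, 0.4))

# The observed proportion of successes should be close to 0.6
mean(launches == "Success")
```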

Expected value is a key concept that combines probabilities with the monetary payoffs of different outcomes. It is calculated by multiplying each possible outcome value by its probability and summing the results. This gives the average outcome if the situation was repeated many times. For the product launch example, let’s say success leads to a $1 million profit while failure results in a $500,000 loss:

Outcomes   Probabilities   Payoffs
Success    0.6             $1M
Failure    0.4             -$500K

The expected value of launching a product is the payoff for a success times the probability a success happens, plus the payoff for a failure times the probability a failure happens:

\[1\text{M} \times 0.6 - 500\text{K} \times 0.4 = 400\text{K}\]

The expected value is $400K, meaning that on average the company would gain $400K per launch if it repeated this decision many times.
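The same calculation can be written as a probability-weighted sum in R, using the payoffs from the table above:

```r
# Probabilities and payoffs from the product-launch example (in dollars)
probs   <- c(success = 0.6, failure = 0.4)
payoffs <- c(success = 1000000, failure = -500000)

# Expected value: sum of probability-weighted payoffs
expected_value <- sum(probs * payoffs)
expected_value  # $400K
```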

Practice Question 1: A company is considering launching a new software product. Market research indicates there’s a 70% chance of success, which would result in a profit of $800K. If the product fails, the company would lose $300K. What is the expected value of launching this product?






18.2 Decision Trees

Decision trees provide a visual way to map out complex decision problems involving multiple choices and uncertain outcomes. They use squares to represent decision nodes where the decision maker chooses between options, and circles to represent chance nodes where probability determines the outcome.

Take this decision tree for example. The Jones Corporation is considering a price increase (the blue square represents that choice). If they don’t increase the price (lower branch), they continue as they are and earn an expected profit of $200K. If they increase the price, their fate is in the customers’ hands (black circle). Customers either accept the price increase, in which case the firm earns $800K in profit, or they reject it, and the firm loses $600K. If customers are equally likely to accept or reject the increase, the expected profit from increasing the price is \(0.5 \times 800\text{K} - 0.5 \times 600\text{K} = 100\text{K}\), which is why the upper branch is labelled $100K.

The upper branch has an expected value of $100K and the lower one has an expected value of $200K: if the firm wants to maximize expected profit, it should choose the lower branch and not increase prices. We draw two vertical lines through the branch that is eliminated.
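Working backward through a tree amounts to computing each chance node's expected value and then, at each decision node, keeping the branch with the higher value. A sketch with the Jones Corporation numbers from the text:

```r
# Upper branch: raise the price; customers accept or reject with equal probability
ev_increase <- 0.5 * 800000 + 0.5 * (-600000)   # expected profit: $100K

# Lower branch: keep the current price
ev_no_increase <- 200000

# At the decision node, keep the branch with the higher expected profit
if (ev_increase > ev_no_increase) "Increase price" else "Keep current price"
```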

18.3 Value of Information

In many situations, managers have the option to gather additional information before making a decision, but this information often comes at a cost. The concept of the expected value of perfect information (EVPI) helps determine how much a decision maker should be willing to pay for information that eliminates uncertainty.

EVPI is calculated as the difference between:

  • the expected value with perfect information, and
  • the expected value of the best decision without additional information.

For example, imagine a company is deciding whether to expand into a new market. They estimate a 60% chance of high demand (leading to $10 million profit) and a 40% chance of low demand (leading to $2 million loss). Without any additional information, the expected value is:

\[0.6 \times 10 \text{M} - 0.4 \times 2\text{M} = \$ 5.2 \text{M}\]

Now imagine they could pay for a perfect market research study that would tell them with certainty whether demand will be high or low before deciding to expand. With this perfect information, they would expand if demand is high and not expand if demand is low. The expected value with perfect information is:

\[0.6 \times 10\text{M} + 0.4 \times 0 = \$ 6 \text{M}\]

The EVPI is the difference:

\[6\text{M} - 5.2\text{M} = \$800\text{K}\]

This means the company should be willing to pay up to $800,000 for perfect information about market demand before making their expansion decision.
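The EVPI calculation above can be laid out step by step in R, using the market-entry numbers from the example:

```r
# Market-entry example: probabilities and payoffs (in dollars)
p_high <- 0.6;  profit_high <- 10e6
p_low  <- 0.4;  loss_low    <- -2e6

# Without additional information: expand either way
ev_without <- p_high * profit_high + p_low * loss_low   # $5.2M

# With perfect information: expand only when demand is high,
# so the low-demand payoff becomes 0
ev_with <- p_high * profit_high + p_low * 0             # $6M

# EVPI: the most the firm should pay for the study
evpi <- ev_with - ev_without                            # $800K
evpi
```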

Practice Question 2: A company is considering launching a new product. Market research suggests there’s a 70% chance of high demand, resulting in a $5 million profit, and a 30% chance of low demand, resulting in a $1 million loss. The company has the option to conduct additional market research that would provide perfect information about the demand before deciding whether to launch. What is the Expected Value of Perfect Information (EVPI) for this decision?






18.4 Utility Functions and Risk Preferences

While expected monetary value is useful, it doesn’t capture how different individuals or organizations may feel about risk. Utility functions provide a way to represent risk preferences mathematically. They map monetary outcomes to “utility” values that represent the subjective value or satisfaction an individual derives from different levels of wealth. There are three main types of risk preferences:

  • Risk-averse: Prefer more certain outcomes. Utility increases at a decreasing rate as wealth increases.
  • Risk-neutral: Care only about expected value, not risk. Utility increases linearly with wealth.
  • Risk-seeking: Prefer riskier outcomes. Utility increases at an increasing rate as wealth increases.

We can model these different preferences with ggplot:

library(tidyverse)

# Three stylized utility functions over wealth:
#   concave (sqrt)  -> risk averse
#   linear          -> risk neutral
#   convex (x^2)    -> risk seeking
# The constants are scaled so all three curves pass near (500, 500).
ggplot() +
  xlab("Wealth") +
  ylab("Utility") +
  stat_function(fun = function(x) 22.4 * sqrt(x), aes(color = "Risk Averse")) +
  stat_function(fun = function(x) x, aes(color = "Risk Neutral")) +
  stat_function(fun = function(x) x^2 / 500, aes(color = "Risk Seeking")) +
  xlim(0, 1000) +
  ylim(0, 1000)

To illustrate how risk preferences affect decisions, let’s consider a simple gamble:

  • 50% chance to win $10,000;
  • 50% chance to lose $5,000.

The expected value is:

\[0.5 \times 10\text{K} - 0.5 \times 5\text{K} = \$ 2.5 \text{K}\]

A risk-neutral decision maker would always take this gamble because the expected value is positive. However, a risk-averse decision maker might reject it due to the possibility of losing money, even though the expected value is positive.

If the risk-averse decision maker has a utility function of \(u(x) = \sqrt{x}\) and an initial wealth of $5,500, compare their expected utility from taking the gamble with their utility from not taking it.

Not taking the gamble:

\[u(5.5\text{K}) = \sqrt{5500} = 74.16\]

Gambling:

They have a 50% chance to win $10K and a 50% chance to lose $5K, so with probability 0.5, they end up with $15,500 and with probability 0.5, they end up with $500.

\[0.5 \times u(15{,}500) + 0.5 \times u(500) = 0.5 \sqrt{15500} + 0.5 \sqrt{500}\]

0.5 * sqrt(15500) + 0.5 * sqrt(500)
[1] 73.42984

The risk-averse person gets a higher utility from keeping their $5,500 instead of gambling it, even though the gamble has an expected value greater than 0.
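The full comparison can be scripted in R with the numbers from this example, so the two options sit side by side:

```r
# Utility function of the risk-averse decision maker
u <- sqrt
wealth <- 5500

# Utility of declining: keep the $5,500 for sure
u_decline <- u(wealth)                                         # sqrt(5500) ~ 74.16

# Expected utility of the gamble: win $10K or lose $5K, each with probability 0.5
u_gamble <- 0.5 * u(wealth + 10000) + 0.5 * u(wealth - 5000)   # ~ 73.43

u_decline > u_gamble   # TRUE: declining gives higher expected utility
```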

Practice Question 3: Which of the following best describes a risk-averse individual’s utility function?






Practice Question 4: A person with initial wealth of $10,000 is offered a gamble with a 50% chance to win $5,000 and a 50% chance to lose $4,000. If this person decides not to take the gamble, what type of risk preference are they most likely exhibiting?






Practice Question 5: Consider a utility function u(x) = x² / 1000, where x is wealth in dollars. Which type of risk preference does this utility function represent?







18.5 Classwork 18: Risk Analysis

  1. Post-Graduation Decision Problem (to turn in the drawings for this problem, please take pictures and upload them): Maria is graduating with an economics degree and faces two initial options:

    • Accept a consulting job offer ($50,000/year guaranteed), or
    • Take a year off to apply to graduate programs (outcome uncertain)
    1. Draw a simple decision tree with these two branches. Represent Maria’s decision node with a square. What’s the expected value of the consulting path? (This is your baseline for comparison)

    2. If Maria takes a year off to apply to graduate programs, there’s a 70% chance of acceptance to at least one program, and a 30% chance of rejection from all programs. If rejected, she can still take the consulting job after a year ($50,000/year). Update your decision tree.

    3. If accepted, Maria must choose between:

      • Master’s (2 years): costs $60,000 total. Then, she has an 80% chance of getting a finance job ($90,000/year), and a 20% chance of getting a government job ($60,000/year).
      • PhD (5 years): receives $30,000/year stipend. Then, a 35% chance of tenure-track job ($120,000/year), and a 65% chance of industry research ($100,000/year).

      Update your decision tree and calculate the total expected income for the first 10 years for each branch. Show that taking the consulting job branch gives Maria $500K for the first 10 years, taking a year off while applying to grad school and then not getting in leads to $450K for the first 10 years, getting a Master’s gives her an expected $528K for the first 10 years, and getting a PhD gives her an expected $578K for the first 10 years.

    4. If Maria is accepted into grad school and is trying to maximize the sum of her expected income for the first 10 years after college, should she pursue a Master’s or a PhD? Eliminate the other branch by drawing 2 small vertical lines through it.

    5. Now you can calculate Maria’s expected 10-year income for the initial branch “taking a year off to apply to grad schools”. Show that this value is $539.6K. Should Maria take a year off to apply to grad schools, or should she take the consulting job right out of college?

    6. Consider another student Sophia. Sophia is facing the same decision as Maria, and has all the same numbers on her decision tree, except that Sophia has a smaller chance of getting into graduate schools if she applies. How small do Sophia’s chances have to be to justify taking the consulting job instead of applying to graduate school? (Assume Sophia is also simply trying to maximize the sum of her income for the first 10 years)

  2. Value of Information in Market Research: A company is considering entering a new market. They estimate a 65% chance of high demand (profit of $8M) and a 35% chance of low demand (loss of $3M). They can conduct market research that will give them perfect information about the demand, but it costs $1.5M.

    1. Calculate the expected value without additional information.

    2. Calculate the expected value with perfect information.

    3. Determine the Expected Value of Perfect Information (EVPI).

    4. Explain why the company should not conduct the market research.

    5. What’s the maximum amount the company should be willing to pay for this perfect information?

  3. Risk Preferences and Utility Functions: Consider three investors with different risk preferences. Their utility functions are:

    • Risk-averse: \(U(x) = \ln(x)\)
    • Risk-neutral: \(U(x) = x\)
    • Risk-seeking: \(U(x) = x^{1.5}\)

    Where x is wealth in thousands of dollars. They each have an initial wealth of $50,000 and are offered a gamble: 50% chance to win $30,000 and 50% chance to lose $20,000.

    1. Calculate the expected monetary value of the gamble.

    2. For each investor, calculate the utility of not taking the gamble, and the utility of taking the gamble.

    3. Show that the risk-neutral and risk-seeking investors take the gamble, but the risk-averse investor does not.

    4. Show that the certainty equivalent of the gamble for the risk-averse investor is around $49K. The certainty equivalent is the amount of money that would give the investor the same utility as the expected utility of the gamble - in other words, it’s the amount of money that would make them indifferent between taking the gamble and having that amount with certainty.