Decision Theory Study Notes
Introduction
Definition: Decision theory is a generalized approach to managerial decision-making,
focusing on available actions and their consequences under varying degrees of certainty.
Objective: Optimize organizational goals (e.g., maximize profit, minimize cost, time, or
waste).
Key Factor: Degree of certainty about future outcomes, ranging from complete certainty
to complete uncertainty, with risk in between.
Characteristics
1. Alternatives: Mutually exclusive and collectively exhaustive decisions available to the
decision-maker (e.g., "do nothing" may be an option).
2. States of Nature: Future conditions or events beyond the decision-maker’s control, also
mutually exclusive and collectively exhaustive.
3. Payoffs: Values (e.g., profits, revenues, costs) associated with each alternative/state of
nature combination. Accurate estimates improve decision quality.
4. Degree of Certainty: Ranges from complete certainty (known outcomes), through risk
(known probabilities), to complete uncertainty (unknown probabilities).
5. Decision Criteria: Rules reflecting the decision-maker’s attitude (optimistic, pessimistic)
and certainty level to select the best alternative.
Payoff Table
Structure: Summarizes alternatives, states of nature, and payoffs (Vij) for each
combination. Probabilities may be included if known.
Format:
            State of Nature
        S1     S2     S3
A1      V11    V12    V13
A2      V21    V22    V23
A3      V31    V32    V33
where Ai = ith alternative, Sj = jth state of nature, Vij = payoff for Ai under Sj.
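As a concrete sketch, the payoff table can be held as a simple mapping from alternatives to rows of payoffs. The structure below is illustrative (not a prescribed representation); the numbers are the profit figures from the worked example later in these notes.

```python
# Payoff table as a dict of rows: keys are alternatives Ai,
# list positions correspond to states S1..S3.
states = ["S1", "S2", "S3"]
payoffs = {
    "A1": [4, 16, 12],
    "A2": [5, 6, 10],
    "A3": [-1, 4, 15],
}

# Look up V12, the payoff for alternative A1 under state S2:
v12 = payoffs["A1"][states.index("S2")]
print(v12)  # 16
```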
Decision-Making Process
1. Identify all possible states of nature.
2. List all feasible alternatives.
3. Determine payoffs for each alternative/state of nature combination.
4. Choose the best alternative based on the decision criterion and payoffs.
Decision-Making Environments
1. Certainty: The state of nature that will occur is known in advance; choose the
alternative with the best payoff under that state.
o Example: Payoff table with profits (S1, S2, S3):
o A1: 4, 16, 12
o A2: 5, 6, 10
o A3: -1, 4, 15
If S2 is certain, choose A1 (16, highest profit).
2. Uncertainty: Multiple states of nature, no probabilities known. Approaches include:
o Maximax (Optimistic): Select the alternative with the maximum of the
maximum payoffs.
Example: A1 (16), A2 (10), A3 (15) → Choose A1 (16).
o For costs: Select minimum of minimum costs (minimin).
o Maximin (Pessimistic): Select the alternative with the maximum of the minimum
payoffs.
Example: A1 (4), A2 (5), A3 (-1) → Choose A2 (5).
For costs: Select minimum of maximum costs.
o Minimax Regret: Use an opportunity loss table (difference from best payoff in
each state). Choose the alternative with the minimum of maximum regrets.
Example:
Opportunity Loss:
A1: 1, 0, 3 → 3
A2: 0, 10, 5 → 10
A3: 6, 12, 0 → 12
Choose A1 (minimum regret = 3).
o Laplace (Insufficient Reason): Assume equal probability for all states, select the
alternative with the highest average payoff.
Example: A1 (10.67), A2 (7), A3 (6) → Choose A1.
o Hurwicz (Realism): Weight the maximum payoff by the coefficient of optimism (α,
0<α<1) and the minimum payoff by (1-α). Choose the alternative with the highest
weighted value.
Example (α=0.4): A1 (8.8), A2 (7), A3 (5.4) → Choose A1.
Limitation: α is subjective.
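The five uncertainty criteria above can be sketched in a few lines each. This is a minimal illustration applied to the example payoff table, with function names chosen for clarity (they are not standard library calls).

```python
payoffs = {
    "A1": [4, 16, 12],
    "A2": [5, 6, 10],
    "A3": [-1, 4, 15],
}

def maximax(p):
    # Optimistic: maximum of the maximum payoffs.
    return max(p, key=lambda a: max(p[a]))

def maximin(p):
    # Pessimistic: maximum of the minimum payoffs.
    return max(p, key=lambda a: min(p[a]))

def laplace(p):
    # Equal probabilities: highest average payoff.
    return max(p, key=lambda a: sum(p[a]) / len(p[a]))

def hurwicz(p, alpha):
    # Weighted blend of best and worst payoffs.
    return max(p, key=lambda a: alpha * max(p[a]) + (1 - alpha) * min(p[a]))

def minimax_regret(p):
    # Build the opportunity-loss table, then minimize the maximum regret.
    n = len(next(iter(p.values())))
    best = [max(p[a][j] for a in p) for j in range(n)]
    regrets = {a: max(best[j] - p[a][j] for j in range(n)) for a in p}
    return min(regrets, key=regrets.get)

print(maximax(payoffs))         # A1
print(maximin(payoffs))         # A2
print(laplace(payoffs))         # A1
print(hurwicz(payoffs, 0.4))    # A1
print(minimax_regret(payoffs))  # A1
```

Note that all criteria except maximin pick A1 here; the criteria can disagree, which is precisely why the decision-maker's attitude matters.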
3. Risk: Known probabilities for states of nature. Approaches include:
o Expected Monetary Value (EMV): Calculate weighted average payoff (EMVi =
Σ(Pj × Vij)). Choose the highest EMV.
Example (P(S1)=0.2, P(S2)=0.5, P(S3)=0.3):
A1: 0.2×4 + 0.5×16 + 0.3×12 = 12.4
A2: 0.2×5 + 0.5×6 + 0.3×10 = 7
A3: 0.2×(-1) + 0.5×4 + 0.3×15 = 6.3
Choose A1 (12.4).
o Expected Opportunity Loss (EOL): Calculate weighted opportunity losses.
Choose the minimum EOL.
Example: A1 (1.1), A2 (6.5), A3 (7.2) → Choose A1.
Note: EOL and EMV yield the same decision.
o Expected Value of Perfect Information (EVPI): Difference between expected
payoff under certainty (EPC) and EMV.
Example: EPC = 0.2×5 + 0.5×16 + 0.3×15 = 13.5; EVPI = 13.5 - 12.4 =
1.1 (equals EOL).
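The EMV, EOL, and EVPI computations above can be reproduced directly from the table and the probabilities P(S1)=0.2, P(S2)=0.5, P(S3)=0.3. A minimal sketch (variable names are illustrative):

```python
probs = [0.2, 0.5, 0.3]
payoffs = {
    "A1": [4, 16, 12],
    "A2": [5, 6, 10],
    "A3": [-1, 4, 15],
}

# EMV_i = sum over j of P_j * V_ij
emvs = {a: sum(p * v for p, v in zip(probs, row)) for a, row in payoffs.items()}

# Best payoff in each state (used for both EOL and EPC).
best = [max(row[j] for row in payoffs.values()) for j in range(len(probs))]

# EOL_i = sum over j of P_j * (best_j - V_ij)
eols = {a: sum(p * (b - v) for p, b, v in zip(probs, best, row))
        for a, row in payoffs.items()}

epc = sum(p * b for p, b in zip(probs, best))  # expected payoff under certainty
evpi = epc - max(emvs.values())                # equals the minimum EOL

print({a: round(v, 1) for a, v in emvs.items()})  # A1: 12.4, A2: 7.0, A3: 6.3
print({a: round(v, 1) for a, v in eols.items()})  # A1: 1.1, A2: 6.5, A3: 7.2
print(round(epc, 1), round(evpi, 1))              # 13.5 1.1
```

The run confirms the note above: maximizing EMV and minimizing EOL both select A1, and EVPI coincides with the minimum EOL.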
4. Conflict: Outcomes also depend on a competing decision-maker (e.g., game theory);
not covered in detail here.
Decision Trees
Definition: Graphical representation of decisions, states of nature, probabilities, and
payoffs; the tree is drawn left to right and evaluated (rolled back) from right to left.
Components:
o Decision Node: Square, represents choices.
o State of Nature Node: Circle, represents events with probabilities.
o Branches: Lines from squares (alternatives) or circles (states of nature).
Steps:
1. Identify decision points and alternatives.
2. Assign probabilities and payoffs to each branch.
3. Compute the EMV at each state-of-nature node, starting from the right.
4. Choose the alternative with the highest EMV at each decision node.
5. Work backward to the initial decision point.
6. Identify the optimal strategy across all outcomes.
Example (Real Estate Investment):
o Alternatives: Apartment, Office, Warehouse.
o States: Good (0.6), Poor (0.4) economic conditions.
o Payoffs:
o Apartment: 50,000, 30,000
o Office: 100,000, -40,000
o Warehouse: 30,000, 10,000
o EMV: Apartment (42,000), Office (44,000), Warehouse (22,000).
o Decision: Choose Office (highest EMV = 44,000).
o For multi-stage decisions (e.g., 5-year span), extend the tree for each stage.
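The rollback steps above can be sketched as a small recursion. The tree encoding here is illustrative, not a standard format: a decision node is a dict of alternatives, a state-of-nature (chance) node is a list of (probability, subtree) pairs, and a leaf is a payoff.

```python
def rollback(node):
    """Return the EMV of a node by working backward from the leaves."""
    if isinstance(node, (int, float)):          # leaf: terminal payoff
        return node
    if isinstance(node, list):                  # chance node: expected value
        return sum(p * rollback(sub) for p, sub in node)
    # decision node: take the best alternative's EMV
    return max(rollback(sub) for sub in node.values())

# Real estate example: Good (0.6) / Poor (0.4) economic conditions.
tree = {
    "Apartment": [(0.6, 50_000), (0.4, 30_000)],
    "Office":    [(0.6, 100_000), (0.4, -40_000)],
    "Warehouse": [(0.6, 30_000), (0.4, 10_000)],
}

emvs = {alt: rollback(branches) for alt, branches in tree.items()}
choice = max(emvs, key=emvs.get)
print(choice, round(emvs[choice]))  # Office 44000
```

Because `rollback` recurses through nested decision and chance nodes, the same function handles multi-stage trees: a leaf payoff is simply replaced by another decision-node dict for the next stage.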