OVERVIEW

• A decision tree is a schematic representation of alternatives and their consequences in a tree-like diagram; square nodes denote decision points and circular nodes denote chance events.
• The expected value of perfect information (EVPI) is the difference between the expected payoff with perfect information and the expected payoff under risk. It can be calculated in two ways.
• Sensitivity analysis determines the range of probabilities over which the best alternative remains the same; it involves constructing a graph and using algebraic equations of the payoff lines.

Uploaded by

AMT
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PPTX, PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
80 views7 pages

Decision Tree: - Square Nodes: Denote Decision Points - Circular Nodes: Denote Chance Events

- A decision tree is a schematic representation of alternatives and their consequences in a tree-like diagram. It contains decision and chance nodes. - The expected value of perfect information is the difference between the expected payoff with complete information and the expected payoff with risk. It can be calculated in two ways. - Sensitivity analysis determines the range of probabilities over which the best alternative remains the same. It involves constructing a graph and using algebraic equations of slopes and alternatives.

Uploaded by

AMT
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PPTX, PDF, TXT or read online on Scribd
You are on page 1/ 7

DECISION TREE

• A schematic representation of the alternatives available and their possible consequences.
• The diagram has a treelike appearance.
• Useful for analyzing situations that involve sequential decisions.
• A decision tree is composed of a number of nodes that have branches emanating from them.
• Square nodes: denote decision points
• Circular nodes: denote chance events
• Read the tree from left to right.
• Analyze the tree from right to left, starting from the last decision that might be made.
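The right-to-left roll-back procedure can be sketched in a few lines of Python (a minimal illustration, not from the slides; the node encoding is an assumption):

```python
def rollback(node):
    """Return the expected value of a tree node.

    A node is either a terminal payoff (a number), a chance node
    ("chance", [(probability, child), ...]), or a decision node
    ("decision", [child, ...]).
    """
    if isinstance(node, (int, float)):
        return node                                   # terminal payoff
    kind, branches = node
    if kind == "chance":
        # Chance node: probability-weighted average of its branches.
        return sum(p * rollback(child) for p, child in branches)
    if kind == "decision":
        # Decision node: pick the branch with the best expected value.
        return max(rollback(child) for child in branches)
    raise ValueError(f"unknown node kind: {kind}")

# Example: choose between a sure P10 and a gamble with a .30 chance
# of P7, .50 of P12, and .20 of P12.
tree = ("decision", [
    10,
    ("chance", [(.30, 7), (.50, 12), (.20, 12)]),
])
print(rollback(tree))  # the gamble's expected P10.5 beats the sure P10
```

Evaluating from the right (terminal payoffs) to the left (the first decision) is exactly what the recursion does: each call resolves the subtree to its right before the node itself is valued.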
EXPECTED VALUE OF PERFECT INFORMATION (EVPI)

• The difference between the expected payoff with perfect information and the expected payoff under risk.

• There are 2 ways to determine the EVPI:

• Compute the expected payoff under certainty and subtract the expected payoff under risk:
EVPI = expected payoff under certainty – expected payoff under risk

• Use the regret table to compute the EVPI: find the expected regret for each alternative. The minimum expected regret is equal to the EVPI.
EXAMPLE:
Demand:        Low   Moderate   High
Probability:   .30   .50        .20

• Best payoff under low demand is P10, moderate demand P12 and high demand P16.
• The expected payoff under certainty:
.30(P10) + .50(P12) + .20(P16) = P 12.2
• The expected payoff under risk:
.30(P7) + .50(P12) + .20(P12) = P 10.5
• Expected value of perfect information:
EVPI = P 12.2 – P10.5
EVPI = P 1.7
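The arithmetic above can be checked with a few lines of Python (illustration only; all numbers are taken from the example):

```python
probs = [.30, .50, .20]      # P(low), P(moderate), P(high demand)
best_payoffs = [10, 12, 16]  # best payoff under each demand level
risk_payoffs = [7, 12, 12]   # payoffs of the best single alternative

# Expected payoff under certainty: always pick the best payoff
# for whichever demand level occurs.
payoff_certainty = sum(p * v for p, v in zip(probs, best_payoffs))

# Expected payoff under risk: commit to one alternative up front.
payoff_risk = sum(p * v for p, v in zip(probs, risk_payoffs))

evpi = payoff_certainty - payoff_risk
print(payoff_certainty, payoff_risk, evpi)  # about 12.2, 10.5, and 1.7
```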
EXAMPLE
Regret table (probabilities: Low .30, Moderate .50, High .20):

Alternative   Low   Moderate   High
A             P0    P2         P6
B             P3    P0         P4
C             P14   P10        P0

A = .30(0) + .50(2) + .20(6) = 2.2
B = .30(3) + .50(0) + .20(4) = 1.7 (MINIMUM)
C = .30(14) + .50(10) + .20(0) = 9.2
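The regret-table route can be verified the same way (a sketch; the dictionary layout is my own), confirming that the minimum expected regret equals the EVPI of P1.7:

```python
probs = [.30, .50, .20]  # P(low), P(moderate), P(high demand)
regrets = {"A": [0, 2, 6], "B": [3, 0, 4], "C": [14, 10, 0]}

# Expected regret for each alternative: probability-weighted regrets.
expected_regret = {alt: sum(p * r for p, r in zip(probs, row))
                   for alt, row in regrets.items()}

# The alternative with the minimum expected regret; that minimum is the EVPI.
best = min(expected_regret, key=expected_regret.get)
print(expected_regret, best)  # B has the minimum expected regret, 1.7
```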
SENSITIVITY ANALYSIS

• Determining the range of probability over which the choice of alternatives would remain the same.
• It involves constructing a graph and using algebra.


EXAMPLE

Payoffs under states of nature #1 and #2:

Alternative   #1   #2
A              4   12
B             16    2
C             12    8

[Graph: each alternative's payoff plotted against P(2), the probability of state #2. B is best at low values of P(2), C over the middle range, and A at high values.]

Alternative   #1   #2   Slope           Equation
A              4   12   12 – 4 = +8     4 + 8P(2)
B             16    2   2 – 16 = –14    16 – 14P(2)
C             12    8   8 – 12 = –4     12 – 4P(2)
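The crossover probabilities, where the best alternative changes, can be computed by setting pairs of payoff equations equal (a Python sketch; the helper name `crossover` is mine):

```python
# Each alternative's payoff is a line: payoff = intercept + slope * P(2),
# taken from the slope/equation table above.
lines = {"A": (4, 8), "B": (16, -14), "C": (12, -4)}

def crossover(a, b):
    """P(2) at which alternatives a and b give equal payoff."""
    (ia, sa), (ib, sb) = lines[a], lines[b]
    return (ib - ia) / (sa - sb)

print(crossover("B", "C"))  # B and C tie at P(2) = 0.4
print(crossover("C", "A"))  # C and A tie at P(2) = 2/3
```

So B is best for P(2) below .40, C is best between .40 and about .67, and A is best above that; within each range the choice of alternative does not change.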
