
Summary of Chapter 6: Decision Trees and Influence Diagrams

(Goodwin, 2004, 3rd Edition)


Zulkifli
29115563
Decision Trees and Influence Diagrams
Definition
Decision problems are multi-stage in character when the choice of a given option may
result in circumstances which will require yet another decision to be made. Decision trees can
serve a number of purposes when complex multi-stage problems are encountered.
They can help a decision maker to develop a clear view of the structure of a problem
and make it easier to determine the possible scenarios which can result if a particular course
of action is chosen. Influence diagrams offer an alternative way of structuring a complex
decision problem and some analysts find that people relate to them much more easily.
Constructing a Decision Tree

1. A square is used to represent a decision node.
2. A circle, on the other hand, is used to represent a chance node.

The branches emanating from a circle are therefore labeled with probabilities which
represent the decision maker's estimate of the probability that a particular branch will be
followed. An example tree is shown in the accompanying figure.

Determining the Optimal Policy


A decision tree consists of a set of policies. A policy is a plan of action stating which
option is to be chosen at each decision node that might be reached under that policy. The
technique for determining the optimal policy in a decision tree is known as the rollback
method. To apply this method, we analyze the tree from right to left by considering the later
decisions first.

It can be seen that the rollback method allows a complex decision problem to be
analyzed as a series of smaller decision problems. Judgment is therefore needed to determine
where the tree should end. Clearly, the calculations involved in analyzing a large decision tree
can be rather tedious.
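The rollback method described above can be sketched in a few lines of code. The tree structure and payoffs below are hypothetical illustration values, not an example from the book.

```python
# Rollback (backward induction) on a small decision tree.
# A node is a terminal payoff (number), a chance node
# ("chance", [(probability, subtree), ...]), or a decision node
# ("decision", {option_name: subtree, ...}).

def rollback(node):
    """Expected monetary value of a (sub)tree, evaluating the later
    decisions first by recursing from right to left."""
    if isinstance(node, (int, float)):
        return node                                     # terminal payoff
    kind, body = node
    if kind == "chance":
        return sum(p * rollback(sub) for p, sub in body)
    return max(rollback(sub) for sub in body.values())  # best option

# Hypothetical example: develop a product (risky) or abandon (payoff 0).
tree = ("decision", {
    "develop": ("chance", [(0.75, 250_000), (0.25, -100_000)]),
    "abandon": 0,
})
print(rollback(tree))  # 162500.0
```

Recursion mirrors the rollback idea directly: the value of each node depends only on the smaller decision problems to its right.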
Decision Trees and Utility
In the previous section we made the assumption that the decision maker was neutral to
risk. Let us now suppose that the engineer is concerned that his career prospects will be

blighted if the development of the processor leads to a great loss of money for the company.
The figure shows the decision tree, with the utilities replacing the monetary values. After
applying the rollback method it can be seen that now the optimum policy is to develop the
electric-powered design and, if it fails, to abandon the project.
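A small numerical sketch of how replacing monetary values with utilities can change the optimal policy. The payoffs and utility values below are hypothetical, not the engineer's actual figures from the chapter.

```python
# Hypothetical gamble: a risky development versus a safe alternative.
risky = [(0.5, 200_000), (0.5, -50_000)]   # (probability, payoff) pairs
safe = 60_000                              # certain payoff

emv_risky = sum(p * x for p, x in risky)   # 75_000: risky wins on money

# Risk-averse utilities (hypothetical), on a 0-to-1 scale as in the chapter
utility = {-50_000: 0.0, 60_000: 0.85, 200_000: 1.0}
eu_risky = sum(p * utility[x] for p, x in risky)   # 0.5
eu_safe = utility[safe]                            # 0.85: safe now wins
```

With expected monetary value the risky option is preferred (75 000 versus 60 000), but a risk-averse utility function reverses the choice, just as the engineer's concern about a large loss reverses his.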

Phases of Decision Analysis

What determines the decision analyst's provisional representation of the decision
problem? Generally, it will be based upon past experience with similar classes of decision
problems and, to a significant extent, intuition. Keeney said: 'A simple decision tree
emphasizing the problem structure which illustrates the main alternatives, uncertainties, and
consequences, can usually be drawn up in a day. Not only does this often help in defining the
problem, but it promotes client and colleague confidence that perhaps decision analysis can
help.'
Eliciting Decision Tree Representations

Influence diagrams are designed to summarize the dependencies that are seen to exist
among events and acts within a decision. Given certain conditions, influence diagrams can be
converted to trees. The advantage of starting with influence diagrams is that their graphic
representation is more appealing to the intuition of decision makers who may be unfamiliar
with decision technologies. In addition, influence diagrams are more easily revised and
altered as the decision maker iterates with the decision analyst. The definitions used in
influence diagrams are shown in the figure.

As with the decision tree, event nodes are represented by circles and decision nodes by
squares. Arrowed lines between nodes indicate the influence of one node on another. An
arrow pointing into a decision node indicates that the decision is influenced either by a prior
decision or by the occurrence (or not) of prior events. The step-by-step procedure for turning
an influence diagram into a decision tree is as follows:
1. Identify a node with no arrows pointing into it (since there can be no loops, at least
one such node must exist).


2. If there is a choice between a decision node and an event node, choose the decision
node.
3. Place the node at the beginning of the tree and remove the node from the influence
diagram.
4. For the now-reduced diagram, choose another node with no arrows pointing into it. If
there is a choice, a decision node should be chosen.
5. Place this node next in the tree and remove it from the influence diagram.

6. Repeat the above procedure until all the nodes have been removed from the influence
diagram.
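The six steps amount to a topological ordering of the diagram that prefers decision nodes whenever there is a choice. A minimal sketch (the example diagram at the end is hypothetical):

```python
def tree_order(nodes, arrows):
    """nodes: {name: "decision" or "event"}; arrows: set of (src, dst)
    pairs. Returns the left-to-right node ordering for the decision tree."""
    nodes, arrows, order = dict(nodes), set(arrows), []
    while nodes:
        # Steps 1 and 4: find nodes with no arrows pointing into them
        free = [n for n in nodes if all(dst != n for _, dst in arrows)]
        # Step 2: when there is a choice, take a decision node first
        free.sort(key=lambda n: nodes[n] != "decision")
        chosen = free[0]
        order.append(chosen)                  # steps 3 and 5: place in tree
        del nodes[chosen]                     # remove node from the diagram
        arrows = {a for a in arrows if chosen not in a}
    return order                              # step 6: all nodes removed

# Hypothetical diagram: decide on a test, observe its result, then
# decide whether to proceed, then observe the outcome.
order = tree_order(
    {"test": "decision", "result": "event",
     "proceed": "decision", "outcome": "event"},
    {("test", "result"), ("result", "proceed"), ("proceed", "outcome")},
)
print(order)  # ['test', 'result', 'proceed', 'outcome']
```

Because the diagram contains no loops, at least one node is always free, so the loop terminates with every node placed in the tree.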

Summary of Chapter 8: Revising Judgments in the Light of New Information
(Goodwin, 2004, 3rd Edition)
Zulkifli
29115563
Revising Judgments in the Light of New Information
This chapter is about the process of revising initial probability estimates in the light of
new information. The focus of our discussion will be Bayes' theorem, which is named after
an English clergyman, Thomas Bayes, whose ideas were published posthumously in 1763.
Bayes' theorem will be used as a normative tool, telling us how we should revise our
probability assessments when new information becomes available.
Bayes' Theorem
In Bayes' theorem an initial probability estimate is known as a prior probability.
When Bayes' theorem is used to modify a prior probability in the light of new information,
the result is known as a posterior probability. This chapter develops the idea intuitively and
then shows how a probability tree can be used to revise prior probabilities.

The steps can be summarized as follows:


1. Construct a tree with branches representing all the possible events which can occur and
write the prior probabilities for these events on the branches.
2. Extend the tree by attaching to each branch a new branch which represents the new
information which you have obtained. On each branch write the conditional probability
of obtaining this information given the circumstance represented by the preceding
branch.
3. Obtain the joint probabilities by multiplying each prior probability by the conditional
probability which follows it on the tree.
4. Sum the joint probabilities.
5. Divide the appropriate joint probability by the sum of the joint probabilities to obtain
the required posterior probability.
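The five steps can be traced directly in code. The prior and conditional probabilities below are hypothetical illustration values, not numbers from the chapter.

```python
# Step 1: prior probabilities for the possible events (hypothetical)
priors = {"gas": 0.6, "no gas": 0.4}
# Step 2: P(test indicates gas | event), written on the new branches
conditionals = {"gas": 0.7, "no gas": 0.3}
# Step 3: joint probabilities along each path of the tree
joints = {e: priors[e] * conditionals[e] for e in priors}
# Step 4: sum of the joint probabilities
total = sum(joints.values())
# Step 5: each posterior is its joint probability divided by the sum
posteriors = {e: joints[e] / total for e in joints}
print(posteriors)  # gas: about 0.778, no gas: about 0.222
```

Note that the posteriors always sum to one, since each joint probability is divided by the same total.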
The effect of new information on the revision of probability judgments
Figure 8.5 shows the probability tree and the calculation of the posterior probabilities. It
can be seen that these probabilities are identical to the probabilities of the test drilling giving
a correct or misleading result. In other words, the posterior probabilities depend only upon
the reliability of the new information. The vague prior probabilities have had no influence
on the result.

A more general view of the relationship between the vagueness of the prior
probabilities and the reliability of the new information can be seen in Figure 8.6. In the figure
the horizontal axis shows the prior probability that gas will be found, while the vertical axis
represents the posterior probability when the test drilling has indicated that gas will be found.
For example, if the prior probability is 0.4 and the result of the test drilling is 70% reliable
then the posterior probability will be about 0.61. The graph shows that if the test drilling has
only a 50% probability of giving a correct result then its result will not be of any interest and
the posterior probability will equal the prior, as shown by the diagonal line on the graph.
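The relationship plotted in Figure 8.6 can be written as a one-line function. This is a sketch of the underlying Bayes calculation, assuming the test is equally reliable whether or not gas is present.

```python
def posterior(prior, reliability):
    """P(gas | test indicates gas): `prior` is P(gas), `reliability` is the
    probability that the test drilling gives a correct indication."""
    correct = prior * reliability                  # gas present, test right
    misleading = (1 - prior) * (1 - reliability)   # no gas, test wrong
    return correct / (correct + misleading)

print(round(posterior(0.4, 0.7), 2))   # 0.61, as read from the graph
print(round(posterior(0.3, 0.5), 6))   # 0.3: a 50% reliable test adds nothing
```

The second call illustrates the diagonal line on the graph: at 50% reliability the posterior simply equals the prior.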

Applying Bayes theorem to a decision problem


A retailer has to decide whether to hold a large or a small stock of a product for the
coming summer season. A payoff table for the courses of action and outcomes is shown
below:
(Profits)

Decision             Low sales    High sales
Hold small stocks    $80 000      $140 000
Hold large stocks    $20 000      $220 000

The following table shows the retailer's utilities for the above sums of money:

Profit     $20 000    $80 000    $140 000    $220 000
Utility    0          0.5        0.8         1.0
The retailer estimates that there is a 0.4 probability that sales will be low and a 0.6
probability that they will be high. What level of stocks should he hold? A decision tree for the
retailer's problem is shown in Figure 8.7(a). It can be seen that his expected utility is
maximized if he decides to hold a small stock of the commodity.
We can use Bayes' theorem to modify the retailer's prior probabilities in the light of the
new information. Figure 8.7(b) shows the probability tree and the appropriate calculations. It
can be seen that the posterior probabilities of low and high sales are 0.15 and 0.85,

respectively. These probabilities replace the prior probabilities in the decision tree, as shown
in Figure 8.7(c).
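The retailer's calculation can be checked directly using the utilities and probabilities quoted above:

```python
# Utilities from the chapter's table, indexed by decision and sales level
utility = {
    "hold small stocks": {"low": 0.5, "high": 0.8},   # $80 000, $140 000
    "hold large stocks": {"low": 0.0, "high": 1.0},   # $20 000, $220 000
}

def best_option(p_low, p_high):
    """Return (best decision, its expected utility) for given probabilities."""
    eu = {d: p_low * u["low"] + p_high * u["high"]
          for d, u in utility.items()}
    best = max(eu, key=eu.get)
    return best, eu[best]

print(best_option(0.4, 0.6))    # priors: small stocks is best (EU 0.68)
print(best_option(0.15, 0.85))  # posteriors: large stocks is best (EU 0.85)
```

With the prior probabilities the small stock maximizes expected utility, but once the posteriors 0.15 and 0.85 replace them, holding a large stock becomes the better policy.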

The expected value of imperfect information


A summary of the main stages in the above analysis is given below:
(1) Determine the course of action which would be chosen using only the prior probabilities
and record the expected payoff of this course of action;
(2) Identify the possible indications which the new information can give;
(3) For each indication:
(a) Determine the probability that this indication will occur;
(b) Use Bayes' theorem to revise the probabilities in the light of this indication;
(c) Determine the best course of action in the light of this indication (i.e. using the
posterior probabilities) and the expected payoff of this course of action;
(4) Multiply the probability of each indication occurring by the expected payoff of the course
of action which should be taken if that indication occurs, and sum the resulting products. This
will give the expected payoff with imperfect information;


(5) The expected value of the imperfect information is equal to the expected payoff with
imperfect information (derived in stage 4) less the expected payoff of the course of action
which would be selected using the prior probabilities (which was derived in stage 1).
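Stages (1) to (5) can be traced for the retailer's monetary payoffs. The 80% survey reliability below is a hypothetical figure chosen for illustration, not a number from the text.

```python
# Monetary payoffs and prior probabilities from the retailer example
payoffs = {"small": {"low": 80_000, "high": 140_000},
           "large": {"low": 20_000, "high": 220_000}}
prior = {"low": 0.4, "high": 0.6}
# P(survey indication | true sales level), assuming 80% reliability
indications = {"says high": {"low": 0.2, "high": 0.8},
               "says low":  {"low": 0.8, "high": 0.2}}

def best_emv(probs):
    """Expected payoff of the best course of action under given probabilities."""
    return max(sum(probs[s] * pay[s] for s in probs)
               for pay in payoffs.values())

prior_payoff = best_emv(prior)                 # stage 1: 140 000 (large stocks)
with_info = 0.0
for lik in indications.values():               # stage 2: each indication
    p_ind = sum(prior[s] * lik[s] for s in prior)           # stage 3(a)
    post = {s: prior[s] * lik[s] / p_ind for s in prior}    # stage 3(b): Bayes
    with_info += p_ind * best_emv(post)        # stages 3(c) and 4
evii = with_info - prior_payoff                # stage 5
print(round(evii))  # 9600 under the assumed reliability
```

Under these assumed figures the survey would be worth at most about $9 600 to the retailer, well below the $80 000 value of perfect information implied by a perfectly reliable indicator.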

Summary
In this chapter we have discussed the role that new information can play in revising the
judgments of a decision maker. We argued that Bayes' theorem shows the decision maker
how his or her judgments should be modified in the light of new information, and we showed
that this revision will depend both upon the vagueness of the prior judgment and the
reliability of the new information. Of course, receiving information is often a sequential
process. Your prior probability will reflect the information you have received up to the point
in time when you make your initial probability assessment. As each new instalment of
information arrives, you may continue to revise your probability. The posterior probability
you had at the end of last week may be treated as the prior probability this week, and be
revised in the light of this week's information.
We also looked at how the value of new information can be assessed. The expected value
of perfect information was shown to be a useful measure of the maximum amount that it
would be worth paying for information. Calculating the expected value of imperfect
information was seen to be a more involved process, because the decision maker also has to
judge the reliability of the information. Because of this, we stressed the importance of
sensitivity analysis, which allows the decision maker to study the effect of changes in these
assessments.
