
Chapter 14

Probabilistic Reasoning
Introduction
• How to build network models to reason under uncertainty according to the laws of probability theory
• Bayesian Networks
- also known as belief networks, probabilistic networks, causal networks, or knowledge maps
- we define the syntax & semantics of these networks & show how they can be used to capture uncertain knowledge in a natural & efficient way
Bayesian networks
• A simple, graphical notation for conditional independence assertions & hence for compact specification of full joint distributions
• A Bayesian network is a directed graph in which each node is annotated with quantitative probability information.
Syntax:
• a set of nodes, one per variable
• a set of directed links/arrows connecting pairs of nodes; if there is an arrow from node X to node Y, X is said to be a parent of Y
• the graph is directed and acyclic (a link means "directly influences")
• a conditional distribution for each node given its parents: P(Xi | Parents(Xi))
• In the simplest case, the conditional distribution is represented as a conditional probability table (CPT) giving the distribution over Xi for each combination of parent values
Example

Cavity is a direct cause of Toothache & Catch, whereas no direct causal relationship exists between Toothache & Catch.
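As a rough sketch of how such a network can be written down in code, the Cavity/Toothache/Catch structure might be encoded as below. The dictionary layout, the names `cavity_net` and `prob`, and the probability values are illustrative assumptions, not taken from the slides.

```python
# A minimal encoding of a Bayesian network over Boolean variables:
# each node stores its parents and a CPT indexed by the tuple of
# parent values.  The numbers are made up purely for illustration.
cavity_net = {
    "Cavity":    {"parents": [],         "cpt": {(): 0.2}},
    "Toothache": {"parents": ["Cavity"], "cpt": {(True,): 0.6, (False,): 0.1}},
    "Catch":     {"parents": ["Cavity"], "cpt": {(True,): 0.9, (False,): 0.2}},
}

def prob(net, var, value, assignment):
    """P(var = value | parents(var)), read from the node's CPT."""
    node = net[var]
    key = tuple(assignment[p] for p in node["parents"])
    p_true = node["cpt"][key]
    return p_true if value else 1.0 - p_true
```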
Another Example
• You have a new burglar alarm installed at home. It is fairly reliable at detecting a burglary, but also responds on occasion to minor earthquakes.
• You also have two neighbors, John & Mary, who have promised to call you at work when they hear the alarm.
• John always calls when he hears the alarm, but sometimes confuses the telephone ringing with the alarm & calls then, too.
• Mary, on the other hand, likes rather loud music & sometimes misses the alarm altogether.
• Given the evidence of who has or has not called,
- estimate the probability of a burglary (see the code sketch below)
Example contd.
(Figure omitted: the burglary network with its conditional probability tables.)
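Since the figure with the CPTs is not reproduced here, the sketch below writes the same network in the dictionary encoding used above. The name `burglary_net` is illustrative, and the probability values are the ones commonly used in the textbook version of this example; treat them as assumptions rather than numbers read from the slide.

```python
# The burglary network in the same dictionary encoding as cavity_net.
# CPT values are the standard textbook figures for this example (assumed).
burglary_net = {
    "Burglary":   {"parents": [], "cpt": {(): 0.001}},
    "Earthquake": {"parents": [], "cpt": {(): 0.002}},
    "Alarm":      {"parents": ["Burglary", "Earthquake"],
                   "cpt": {(True, True): 0.95, (True, False): 0.94,
                           (False, True): 0.29, (False, False): 0.001}},
    "JohnCalls":  {"parents": ["Alarm"], "cpt": {(True,): 0.90, (False,): 0.05}},
    "MaryCalls":  {"parents": ["Alarm"], "cpt": {(True,): 0.70, (False,): 0.01}},
}
```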
Semantics of Bayesian Networks
• Two ways to understand the semantics:
1. See the network as a representation of the joint probability distribution
- helpful in understanding how to construct networks
2. View the network as an encoding of a collection of conditional independence statements
- helpful in designing inference procedures
Compactness
• A CPT for a Boolean variable Xi with k Boolean parents has 2^k rows, each needing one number (the probability of the other value is its complement)
• For the burglary network: 1 + 1 + 4 + 2 + 2 = 10 numbers, compared with 2^5 - 1 = 31 for the full joint distribution
Global semantics
• The full joint distribution is the product of the local conditional distributions:
P(x1, ..., xn) = ∏i P(xi | Parents(Xi))
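Building on the sketches above, the global semantics can be demonstrated directly: the probability of a complete assignment is just the product of the relevant CPT entries. The helper name `joint_probability` is illustrative, and the code assumes the `prob()` function and `burglary_net` defined earlier.

```python
def joint_probability(net, assignment):
    """Global semantics: P(x1,...,xn) = product of P(xi | parents(Xi))."""
    p = 1.0
    for var, value in assignment.items():
        p *= prob(net, var, value, assignment)
    return p

# Probability that both neighbors call and the alarm sounds,
# but there is neither a burglary nor an earthquake.
event = {"Burglary": False, "Earthquake": False, "Alarm": True,
         "JohnCalls": True, "MaryCalls": True}
print(joint_probability(burglary_net, event))   # about 0.00063
```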
Local semantics
• each node is conditionally independent of its non-descendants given its parents

A node X is conditionally independent of its non-descendants (the Zij's) given its parents (the Ui's shown in the gray area of the figure)

Example: JohnCalls is independent of Burglary & Earthquake, given Alarm


Markov Blanket
• Each node is conditionally independent of all others given its Markov blanket: parents + children + children's parents

Example: Burglary is independent of JohnCalls & MaryCalls, given Alarm & Earthquake
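A node's Markov blanket can be read off the same dictionary structure. The sketch below (the function name `markov_blanket` is illustrative, and `burglary_net` is reused from above) collects parents, children, and the children's other parents.

```python
def markov_blanket(net, var):
    """Parents, children, and children's other parents of var."""
    children = [v for v, node in net.items() if var in node["parents"]]
    blanket = set(net[var]["parents"]) | set(children)
    for child in children:
        blanket |= set(net[child]["parents"])
    blanket.discard(var)
    return blanket

print(markov_blanket(burglary_net, "Burglary"))   # {'Alarm', 'Earthquake'}
```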
Constructing Bayesian networks
• Choose an ordering of the variables X1, ..., Xn
• For i = 1 to n: add Xi to the network and select as its parents a minimal set of nodes from X1, ..., Xi-1 such that P(Xi | Parents(Xi)) = P(Xi | Xi-1, ..., X1)

Example
(Figures omitted: the network is built one variable at a time for the ordering MaryCalls, JohnCalls, Alarm, Burglary, Earthquake; the first variable added, MaryCalls, has no parents.)
• Deciding conditional independence is hard in non-causal directions
(Causal models and conditional independence seem hardwired for humans!)
• Assessing conditional probabilities is hard in non-causal directions
- the network is less compact: 1 + 2 + 4 + 2 + 4 = 13 numbers needed, versus 10 for the causal ordering
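The parameter counts can be checked mechanically: a Boolean node with k Boolean parents needs 2^k numbers. In the sketch below, `cpt_entries` and `noncausal_net` are illustrative names; the non-causal parent sets are assumed from the ordering MaryCalls, JohnCalls, Alarm, Burglary, Earthquake, which reproduces the slide's count of 13.

```python
def cpt_entries(net):
    """Independent probabilities needed: 2**len(parents) per Boolean node."""
    return sum(2 ** len(node["parents"]) for node in net.values())

print(cpt_entries(burglary_net))      # 1 + 1 + 4 + 2 + 2 = 10 (causal ordering)

# Structure assumed for the non-causal ordering MaryCalls, JohnCalls,
# Alarm, Burglary, Earthquake; only the parent sets matter for the count.
noncausal_net = {
    "MaryCalls":  {"parents": []},
    "JohnCalls":  {"parents": ["MaryCalls"]},
    "Alarm":      {"parents": ["MaryCalls", "JohnCalls"]},
    "Burglary":   {"parents": ["Alarm"]},
    "Earthquake": {"parents": ["Alarm", "Burglary"]},
}
print(cpt_entries(noncausal_net))     # 1 + 2 + 4 + 2 + 4 = 13
```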
Other approaches to uncertain reasoning
Different generations of expert systems have used:
• Strict logical reasoning (ignoring uncertainty)
• Probabilistic techniques using the full joint distribution
• Default reasoning: a conclusion is believed until a better reason is found to believe something else
• Rules with certainty factors
• Handling ignorance: Dempster-Shafer theory
• Vagueness: something is "sort of" true (fuzzy logic)
Probability makes the same ontological commitment as logic: the event is true or false.
Rule-based methods
Logical reasoning systems have properties such as:
• Monotonicity
• Locality
• Detachment
• Truth-functionality
These properties give obvious computational advantages, but they make such systems inappropriate for uncertain reasoning.
Summary
• Bayesian Networks
• Compactness
• Constructing Bayesian Networks
