Alarm Problem


Explanation of Bayesian networks:


Let's understand Bayesian networks through an example by constructing a directed acyclic
graph:

Example: Harry installed a new burglar alarm at his home to detect burglary. The alarm
responds reliably to a burglary, but it can also be triggered by minor earthquakes. Harry has
two neighbors, David and Sophia, who have taken responsibility for informing Harry at work
when they hear the alarm. David always calls Harry when he hears the alarm, but he sometimes
confuses the phone ringing with the alarm and calls then as well. Sophia, on the other hand,
likes to listen to loud music, so she sometimes fails to hear the alarm. Here we would like
to compute the probability of the burglar alarm sounding.

Problem:

Calculate the probability that the alarm has sounded, but neither a burglary nor
an earthquake has occurred, and both David and Sophia called Harry.

Solution:

• The Bayesian network for this problem is given below. The network structure shows
that Burglary and Earthquake are the parent nodes of Alarm and directly affect the
probability of the alarm going off, whereas David's and Sophia's calls depend only on
the alarm.
• The network encodes the assumptions that David and Sophia do not directly perceive the
burglary, do not notice a minor earthquake, and do not confer with each other before calling.
• The conditional distribution for each node is given as a conditional probability table,
or CPT.
• Each row in a CPT must sum to 1, because the entries in a row represent an
exhaustive set of cases for the variable.
• In a CPT, a Boolean variable with k Boolean parents has 2^k rows of probabilities. Hence, if
there are two parents, the CPT contains 4 rows of probability values.

List of all events occurring in this network:

• Burglary (B)
• Earthquake (E)
• Alarm (A)
• David calls (D)
• Sophia calls (S)

We can write the events of the problem statement as the probability P[D, S, A, B, E], and
expand this joint probability using the chain rule together with the conditional
independencies encoded in the network:

P[D, S, A, B, E] = P[D | S, A, B, E] · P[S, A, B, E]

= P[D | S, A, B, E] · P[S | A, B, E] · P[A, B, E]

= P[D | A] · P[S | A] · P[A, B, E]

= P[D | A] · P[S | A] · P[A | B, E] · P[B, E]

= P[D | A] · P[S | A] · P[A | B, E] · P[B] · P[E]

Here D and S depend only on their parent A, and B and E are independent, so
P[B | E] = P[B].

Let's take the observed prior probabilities for the Burglary and Earthquake components:

P(B = True) = 0.002, the probability of a burglary.
P(B = False) = 0.998, the probability of no burglary.
P(E = True) = 0.001, the probability of a minor earthquake.
P(E = False) = 0.999, the probability that no earthquake occurred.

We can provide the conditional probabilities in the tables below:

Conditional probability table for Alarm (A):

The conditional probability of Alarm A depends on Burglary and Earthquake:

B       E       P(A = True)    P(A = False)
True    True    0.94           0.06
True    False   0.95           0.05
False   True    0.31           0.69
False   False   0.001          0.999
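The row-sum constraint stated earlier can be checked mechanically. A minimal Python sketch (the dictionary layout is illustrative; note that the B = True, E = False row must be (0.95, 0.05) for the row to sum to 1):

```python
# CPT for Alarm, indexed by (Burglary, Earthquake): (P(A=True), P(A=False))
alarm_cpt = {
    (True, True):   (0.94, 0.06),
    (True, False):  (0.95, 0.05),
    (False, True):  (0.31, 0.69),
    (False, False): (0.001, 0.999),
}

# Every row must sum to 1, since its entries are exhaustive cases for A.
for parents, (p_true, p_false) in alarm_cpt.items():
    assert abs(p_true + p_false - 1.0) < 1e-9, parents
```

With two Boolean parents the table has 2^2 = 4 rows, as noted above.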
Conditional probability table for David calls (D):

The conditional probability that David calls depends on the probability of the alarm:

A       P(D = True)    P(D = False)
True    0.91           0.09
False   0.05           0.95

Conditional probability table for Sophia calls (S):

The conditional probability that Sophia calls depends on her parent node, Alarm:

A       P(S = True)    P(S = False)
True    0.75           0.25
False   0.02           0.98

From the factored form of the joint distribution, we can write the problem statement as a
probability:

P(S, D, A, ¬B, ¬E) = P(S | A) · P(D | A) · P(A | ¬B ∧ ¬E) · P(¬B) · P(¬E)

= 0.75 × 0.91 × 0.001 × 0.998 × 0.999

≈ 0.00068045
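The arithmetic above can be reproduced directly from the tables; a small Python sketch (the variable names are illustrative):

```python
# Priors and CPT entries taken from the tables above.
p_not_b = 0.998                 # P(B = False)
p_not_e = 0.999                 # P(E = False)
p_a_given_not_b_not_e = 0.001   # P(A = True | B = False, E = False)
p_d_given_a = 0.91              # P(D = True | A = True)
p_s_given_a = 0.75              # P(S = True | A = True)

# Factored joint: P(S, D, A, not-B, not-E)
p = p_s_given_a * p_d_given_a * p_a_given_not_b_not_e * p_not_b * p_not_e
print(round(p, 8))  # -> 0.00068045
```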

Hence, a Bayesian network can answer any query about the domain by using the joint
distribution.

The Semantics of Bayesian Networks

Bayesian Networks (BNs) are graphical models that represent a set of variables and their
conditional dependencies via a directed acyclic graph (DAG). Each node represents a random
variable, and edges denote conditional dependencies. The semantics of BNs allow for the
computation of joint probability distributions efficiently by exploiting conditional
independencies.

There are two ways to understand the semantics of a Bayesian network:

1. As a representation of the joint probability distribution. This view is helpful for
understanding how to construct the network.

2. As an encoding of a collection of conditional independence statements. This view is
helpful for designing inference procedures.
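As an illustration of answering a query from the joint distribution, the following Python sketch enumerates all assignments of the five variables using the factored joint from this example (the function name and the particular query are illustrative):

```python
from itertools import product

# Priors and CPTs from the tables above.
P_B = {True: 0.002, False: 0.998}
P_E = {True: 0.001, False: 0.999}
P_A = {(True, True): 0.94, (True, False): 0.95,
       (False, True): 0.31, (False, False): 0.001}   # P(A=True | B, E)
P_D = {True: 0.91, False: 0.05}                      # P(D=True | A)
P_S = {True: 0.75, False: 0.02}                      # P(S=True | A)

def joint(b, e, a, d, s):
    """Factored joint P(B, E, A, D, S) given by the network structure."""
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pd = P_D[a] if d else 1 - P_D[a]
    ps = P_S[a] if s else 1 - P_S[a]
    return P_B[b] * P_E[e] * pa * pd * ps

# Example query: P(B = True | D = True, S = True), by enumeration.
num = sum(joint(True, e, a, True, True)
          for e, a in product([True, False], repeat=2))
den = sum(joint(b, e, a, True, True)
          for b, e, a in product([True, False], repeat=3))
print(num / den)
```

Summing the numerator and denominator over the hidden variables is exactly inference by enumeration; the denominator normalizes over both values of B.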

Time and Uncertainty

Modeling temporal processes under uncertainty is essential in dynamic systems. Temporal
models, such as Hidden Markov Models (HMMs), are used to represent systems that evolve
over time with probabilistic transitions between states. These models are widely applied in
areas like speech recognition and bioinformatics.

In previous sections, we studied Bayesian networks for representing uncertain knowledge
in static environments. However, many real-world processes are dynamic, meaning
they evolve over time. This introduces an additional layer of complexity: time-based
uncertainty.

For example, in the burglary alarm scenario, what if the alarm rings multiple times across
different time steps? Or what if David and Sophia call on different days? To reason about such
problems, we require temporal models that handle both uncertainty and time.

Causes of Time-Based Uncertainty in Real Life:

1. Events occur in sequence, not all at once.
2. Sensor observations may be delayed or noisy.
3. The state of the system can change unpredictably.
4. Actions have consequences that unfold over time.

Temporal Reasoning in AI

To handle this uncertainty over time, AI uses temporal probabilistic models.

These models allow:

 Predicting future outcomes based on current and past observations.
 Estimating the past state of the system using present and future data.
 Tracking a system's hidden state over time (such as whether a burglary is in
progress).
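As a sketch of the third task (tracking a hidden state over time), here is a minimal HMM-style filtering step in Python for the burglary setting: a hidden state (whether the alarm is ringing) evolves between time steps, and we update our belief from noisy observations (whether David called). All of the probabilities below are illustrative assumptions, not values taken from the tables above.

```python
# Minimal HMM forward (filtering) update.
# Hidden state: alarm is ringing (True) or not (False) at each time step.
# Observation: whether David called at that time step.
# All numbers are illustrative assumptions.

transition = {True: 0.3, False: 0.1}    # P(alarm_t = True | alarm_{t-1})
obs_model = {True: 0.91, False: 0.05}   # P(David calls | alarm_t)

def forward_step(belief, called):
    """One filtering update: predict with the transition model,
    then weight by the observation likelihood and normalize."""
    # Predict: P(alarm_t = True) before seeing the observation.
    predicted = belief * transition[True] + (1 - belief) * transition[False]
    # Update: weight each state by how well it explains the observation.
    like_true = obs_model[True] if called else 1 - obs_model[True]
    like_false = obs_model[False] if called else 1 - obs_model[False]
    num = like_true * predicted
    den = num + like_false * (1 - predicted)
    return num / den

belief = 0.1  # initial P(alarm ringing)
for called in [True, True, False]:
    belief = forward_step(belief, called)
print(belief)
```

Each call to `forward_step` performs one step of the forward algorithm: the belief is propagated through the transition model and then conditioned on the new observation.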
