
Probabilistic Reasoning
• In Fuzzy Logic, we handle degrees of truth (e.g., "Hot" is 0.7).
• But sometimes we need to handle uncertainty about events, e.g.:
👉 "What is the probability that a patient has a disease given some symptoms?"
• This is the domain of Probabilistic Reasoning.
• 🔹 Classical Logic: True or False (certainty).
🔹 Fuzzy Logic: Partial truth (e.g., 0.7 hot).
🔹 Probabilistic Reasoning: Likelihood of events (e.g., 70% chance of flu); the short sketch below contrasts the three.
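A minimal sketch of the three representations in Python, using the values from the bullets above; the variable names are improvised for illustration:

# Three ways to represent knowledge about "hot" / "flu"
classical_hot = True   # Classical logic: strictly True or False
fuzzy_hot = 0.7        # Fuzzy logic: degree of truth in [0, 1]
p_flu = 0.70           # Probabilistic: likelihood the event occurs

# A fuzzy 0.7 means "hot to degree 0.7" (partial truth), while a
# probability of 0.7 means the event either happens or not, but we
# are 70% confident that it does.
print(classical_hot, fuzzy_hot, p_flu)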
Bayes' Theorem
• Definition and formula
• Bayes' theorem is a fundamental concept in probability theory that describes the probability of an event based on prior knowledge of conditions that might be related to the event.
• It is named after the Reverend Thomas Bayes, who first formulated the theorem.
• The theorem is particularly useful for updating probabilities as new evidence becomes available.
Bayes' Theorem

P(A∣B) = [P(B∣A) × P(A)] / P(B)

• P(A∣B) is the posterior probability of event A given evidence B.
• P(B∣A) is the likelihood of evidence B given that event A has occurred.
• P(A) is the prior probability of event A.
• P(B) is the probability of evidence B occurring.
Formula
• In words, the formula states that the probability of event A occurring given evidence B is proportional to the likelihood of observing evidence B given that event A has occurred, multiplied by the prior probability of event A, and divided by the probability of observing evidence B regardless of whether event A has occurred or not.
Types of Probabilities in Bayes' Theorem
• Components of Naïve Bayes – prior, likelihood, and posterior probabilities.
• Prior Probability (P(A)):
• The prior probability represents your initial belief or the probability assigned to an event before considering any new evidence.
• In the context of Bayes' theorem, P(A) is the probability of event A before observing any evidence.
• It reflects what you believe about the probability of A based on previous information or knowledge.
Components
• Likelihood (P(B∣A))
• The likelihood is the probability of observing new evidence (B) given a
particular hypothesis or event (A).
• It describes how well the data supports a particular hypothesis.
• In the context of Bayes' theorem, P(B∣A) is the probability of observing
evidence B given that event A has occurred.
Components
• Posterior Probability (P(A∣B))
• The posterior probability is the updated probability of a hypothesis or event
(A) after taking into consideration the new evidence (B)
• It represents your revised belief based on the observed data.
• In the context of Bayes' theorem, P(A∣B) is the probability of event A given the observed evidence B.
Example:
• A = "Person has COVID"
• B = "Person has a fever“

✅ P(A) = 0.05 (5% people have COVID)


✅ P(B | A) = 0.9 (90% COVID patients have fever)
✅ P(B) = 0.1 (10% of all people have fever)

Applying Bayes' Rule:

P(A∣B) = (0.9 × 0.05) / 0.1 = 0.45
Bayesian Networks (BNs)
• A Bayesian Network (BN) is a graphical model that represents
dependencies between variables using nodes (variables) and edges
(dependencies)
• Example: Medical Diagnosis BN

Flu → Fever
Flu → Cough

• If a person has Flu, it increases the chances of Fever & Cough


• If we observe Fever, we can update our belief about Flu using Bayes' Theorem (a sketch follows below).
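A minimal sketch of the Flu → Fever link in plain Python. The probabilities below are illustrative assumptions, not real data; in particular P_fever_given_no_flu is invented here to make the marginalization step visible:

# Tiny piece of the Medical Diagnosis BN: Flu -> Fever
P_flu = 0.05                 # P(Flu): prior (assumed)
P_fever_given_flu = 0.90     # P(Fever | Flu) (assumed)
P_fever_given_no_flu = 0.08  # P(Fever | no Flu) (assumed)

# Marginalize over Flu to get the evidence probability:
# P(Fever) = P(Fever|Flu)P(Flu) + P(Fever|no Flu)P(no Flu)
P_fever = P_fever_given_flu * P_flu + P_fever_given_no_flu * (1 - P_flu)

# Bayes' Theorem updates the belief once Fever is observed
P_flu_given_fever = P_fever_given_flu * P_flu / P_fever
print(f"P(Flu | Fever) = {P_flu_given_fever:.2f}")  # ≈ 0.37 with these numbers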
Bayes in Python

# Given probabilities
P_COVID = 0.05             # Prior: P(A), probability of having COVID
P_FEVER_given_COVID = 0.9  # Likelihood: P(B | A)
P_FEVER = 0.1              # Evidence: P(B)

# Applying Bayes' Theorem
P_COVID_given_FEVER = (P_FEVER_given_COVID * P_COVID) / P_FEVER

print(f"Probability of COVID given Fever: {P_COVID_given_FEVER:.2f}")
# Output: 0.45
Non-Monotonic Reasoning
& Default Logic
Non-Monotonic Reasoning
• In classical logic, once something is true, it always remains true
• In real life, knowledge changes when new information arrives

👉 Non-Monotonic Reasoning allows conclusions to be revised based on


new evidence.
Example: Why Classical Logic Fails
📌 Example: "Birds can fly."
✅ So, we conclude: Tweety (a bird) can fly.
❌ But then we learn: Tweety is a penguin.
✅ Now we must retract our previous conclusion!
Approaches to Handle NMR
1️⃣ Default Logic – Use assumptions until proven wrong.
2️⃣ Autoepistemic Logic – Model reasoning about knowledge itself.
3️⃣ Circumscription – Assume minimal change when adding new facts.
1. Default Logic (Most Practical)
👉 Uses assumptions that hold unless contradicted by new information

🔹 Proposed by Reiter (1980) for handling incomplete knowledge.


📌 Example: "Birds Can Fly" (Default Logic in Action)
Step 1: Initial Knowledge (Default Rule)
💡 Rule:
"If X is a bird, then assume X can fly (unless told otherwise)."
✅ Tweety is a bird → Assume Tweety can fly.
Step 2: New Information (Exception)
❌ But now, we learn: Tweety is a penguin.
👉 The default assumption (flying) is revoked, and a new fact is
accepted.
✅ Conclusion changes dynamically!
Default Logic Notation
A default rule is written as:

Precondition: P(X)
Conclusion: Q(X) (unless contradicted)

For birds flying:
Bird(X) ⇒ CanFly(X) (by default)

For penguins:
Penguin(X) ⇒ ¬CanFly(X)
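A minimal sketch of this default rule in Python; the facts dictionary and the can_fly helper are improvised encodings for illustration:

# Default reasoning: Bird(X) => CanFly(X), unless contradicted
facts = {"bird": {"tweety", "polly"}, "penguin": {"tweety"}}

def can_fly(x):
    """Apply the default rule, checking exceptions first."""
    if x in facts["penguin"]:   # Penguin(X) => not CanFly(X)
        return False            # the default is retracted
    return x in facts["bird"]   # the default assumption holds

print(can_fly("polly"))   # True  (default applies)
print(can_fly("tweety"))  # False (conclusion revised: Tweety is a penguin)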
Autoepistemic Logic (Reasoning
About Knowledge)
👉 "I believe X to be true unless I learn otherwise."
🔹 Used for self-reasoning AI (e.g., knowledge-based agents).

📌 Example:

🤔 "I believe a train arrives at 10 AM."


🚆 If a new schedule says 11 AM, I update my belief.

🔹 This is useful for AI systems that reason about their own knowledge.
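A minimal sketch of this kind of belief revision, using the train example; the beliefs dictionary and the learn function are hypothetical names:

# "I believe the train arrives at 10 AM" - held until contradicted
beliefs = {"train_arrival": "10 AM"}

def learn(key, value):
    """Revise a belief when new information arrives."""
    beliefs[key] = value

print(beliefs["train_arrival"])  # 10 AM (current belief)
learn("train_arrival", "11 AM")  # a new schedule is observed
print(beliefs["train_arrival"])  # 11 AM (belief updated)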
3. Circumscription (Minimal
Change)
👉 Assumes things stay the same unless explicitly changed.
🔹 Used in AI Planning, Diagnosis Systems.
📌 Example:
"People usually have two eyes."
If we meet a one-eyed person, we don’t assume all people have one eye
—just this case is special.
🔹 Mathematical Representation:
Minimize Ab(X) – keep the set of "abnormal" individuals as small as possible.

This means: assume as little change as possible.
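A minimal sketch of the two-eyes example, keeping an explicit abnormality set as small as possible; the encoding and the names are improvised:

# Circumscription-style reasoning: minimize the abnormality set
abnormal = set()   # Ab starts empty - assume nobody is abnormal

def eyes(person):
    """Default: two eyes, unless the person is a known abnormal case."""
    return 1 if person in abnormal else 2

abnormal.add("blackbeard")   # one explicitly observed exception (hypothetical)
print(eyes("alice"))         # 2 - still assumed normal
print(eyes("blackbeard"))    # 1 - only this case is treated as special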


Advanced Topics in
Fuzzy Logic
Type-2 Fuzzy Logic (Handling Higher
Uncertainty)
• Traditional Fuzzy Logic (Type-1) assumes exact membership values
(e.g., 0.6 "Warm")
• Type-2 Fuzzy Logic allows uncertainty in membership functions
(fuzziness within fuzziness)
📌 Example:
In a noisy environment (e.g., voice recognition), a word may have
multiple fuzzy interpretations.
Type-2 Fuzzy Systems handle this extra uncertainty better than Type-1.
✅ Used in: Robotics, AI with highly uncertain data.
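A minimal sketch of an interval Type-2 membership function for "Warm"; the triangular shape, the 25-degree center, and the 0.15 uncertainty band are all assumed values:

# Interval Type-2 fuzzy set: each input maps to a [lower, upper] interval
def warm_membership(temp):
    """Return (lower, upper) membership of temp in 'Warm'."""
    # Type-1-style triangle centered at 25 degrees, width 10 (assumed)
    primary = max(0.0, 1.0 - abs(temp - 25.0) / 10.0)
    spread = 0.15  # uncertainty about the membership curve itself
    return max(0.0, primary - spread), min(1.0, primary + spread)

print(warm_membership(22.0))  # ≈ (0.55, 0.85): fuzziness within fuzziness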
Fuzzy Neural Networks (FNN)
• Combines Fuzzy Logic with Neural Networks (AI)
• The neural network learns fuzzy rules automatically
📌 Example:
• A self-driving car can use Fuzzy Logic to handle smooth speed
control.
• A Neural Network can learn how human drivers use fuzzy concepts
like "Drive Fast" or "Slow Down".
✅ Used in: AI, Machine Learning, Predictive Systems.
Fuzzy Clustering (Fuzzy C-Means Algorithm)
• Unlike traditional clustering (K-Means), Fuzzy Clustering allows soft
boundaries
• A data point can belong to multiple clusters with different degrees
📌 Example:
• A person may partly belong to both the "Young" and "Middle-Aged" categories.
• Instead of hard classification, Fuzzy Clustering assigns membership
degrees
✅ Used in: Image Processing, Medical Diagnosis, Data Mining.
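A minimal sketch of the Fuzzy C-Means membership step (fuzzifier m = 2); the ages and the cluster centers below are made-up values:

import numpy as np

def fcm_memberships(points, centers, m=2.0):
    """One FCM step: soft membership of each point in each cluster."""
    # Distances from every point to every center: shape (n_points, n_clusters)
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    d = np.fmax(d, 1e-10)  # guard against division by zero
    # u[j, i] = 1 / sum_k (d[j, i] / d[j, k]) ** (2 / (m - 1))
    ratios = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratios.sum(axis=2)

ages = np.array([[20.0], [35.0], [60.0]])       # illustrative 1-D data
centers = np.array([[25.0], [45.0]])            # "Young" and "Middle-Aged"
print(fcm_memberships(ages, centers).round(2))  # each row sums to 1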
Adaptive Fuzzy Systems (Self-Tuning)
• Fuzzy Logic parameters (rules & membership functions) adapt over
time.
• These systems "learn" from new data and improve accuracy.
📌 Example:
• An air conditioner adjusts rules over time based on human
feedback.
• ✅ Used in: Smart Devices, AI-powered Fuzzy Control.
Neuro-Fuzzy Logic
Neural Networks
How Do Neurons Work?
Neuro-Fuzzy Logic
👉 Neuro-Fuzzy Systems combine Artificial Neural Networks (ANNs) and Fuzzy Logic to create adaptive, learning-based fuzzy systems.

| Feature          | Neural Networks (ANNs)       | Fuzzy Logic                      | Neuro-Fuzzy (Hybrid)                       |
| Learning Ability | Learns from data             | Fixed rules                      | Learns fuzzy rules from data               |
| Interpretability | Hard to interpret            | Easy to explain                  | Keeps fuzzy explainability                 |
| Adaptability     | Adapts, but no logic         | Fixed rules                      | Adapts fuzzy rules dynamically             |
| Best Used In     | Image, speech, deep learning | Expert systems, decision-making  | AI systems that need learning + reasoning  |