Assignment AI

The document outlines an assignment for a course on Introduction to AI and Data Analytics, covering various topics related to probability, Bayesian networks, machine learning algorithms, and support vector machines. It includes specific problems and questions related to coin toss probabilities, children's gender probabilities, disease testing probabilities, optimization algorithms, and the implementation of basic logic gates with perceptrons. The assignment aims to assess understanding of these concepts through practical applications and theoretical explanations.


Course: MCA
Session: 2023-24
Date of Submission: 03 May 2024

Subject (Subject Code): Introduction to AI and Data Analytics (MCAC0022)


Assignment No. 2

1. A box contains three coins: two regular coins and one fake two-headed coin (P(H) = 1).

a) You pick a coin at random and toss it. What is the probability that it lands heads up?

b) You pick a coin at random and toss it, and get heads. What is the probability that it is the two-headed
coin?
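Both parts can be sanity-checked with a short Monte Carlo sketch (the exact answers follow from the law of total probability and Bayes' theorem; the simulation below only estimates them):

```python
import random

def simulate(trials=200_000, seed=0):
    """Monte Carlo check for the three-coin box: two fair coins, one two-headed."""
    rng = random.Random(seed)
    heads = 0
    heads_and_fake = 0
    for _ in range(trials):
        coin = rng.choice(["fair", "fair", "fake"])
        # The fake coin always lands heads; a fair coin does so half the time.
        is_heads = True if coin == "fake" else rng.random() < 0.5
        if is_heads:
            heads += 1
            if coin == "fake":
                heads_and_fake += 1
    p_heads = heads / trials                     # estimates P(H) = 2/3
    p_fake_given_heads = heads_and_fake / heads  # estimates P(fake | H) = 1/2
    return p_heads, p_fake_given_heads

p_h, p_fake = simulate()
```

The estimates should land close to the analytic values P(H) = (2 · 1/2 + 1) / 3 = 2/3 and P(fake | H) = (1/3) / (2/3) = 1/2.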

2. Consider a family that has two children. We are interested in the children's genders. Our sample space
is S={(G,G),(G,B),(B,G),(B,B)}. Also assume that all four possible outcomes are equally likely.

a) What is the probability that both children are girls given that the first child is a girl?

b) We ask the father: "Do you have at least one daughter?" He responds "Yes!" Given this extra
information, what is the probability that both children are girls? In other words, what is the probability
that both children are girls given that we know at least one of them is a girl?
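Because the sample space has only four equally likely outcomes, both conditional probabilities can be computed exactly by enumeration, as in this sketch:

```python
from fractions import Fraction

# Enumerate the equally likely sample space for two children.
space = [("G", "G"), ("G", "B"), ("B", "G"), ("B", "B")]

def p(event, given):
    """Conditional probability by counting equally likely outcomes."""
    cond = [o for o in space if given(o)]
    return Fraction(sum(1 for o in cond if event(o)), len(cond))

both_girls = lambda o: o == ("G", "G")
p_a = p(both_girls, given=lambda o: o[0] == "G")  # first child is a girl -> 1/2
p_b = p(both_girls, given=lambda o: "G" in o)     # at least one girl     -> 1/3
```

The two answers differ because conditioning on "the first child is a girl" leaves two outcomes, while "at least one girl" leaves three.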

3. A box contains two coins: a regular coin and one fake two-headed coin (P(H) = 1). I choose a coin at
random and toss it twice. Define the following events.

- A = First coin toss results in an H.

- B = Second coin toss results in an H.

- C = Coin 1 (regular) has been selected.

Find P(A|C), P(B|C), P(A∩B|C), P(A), P(B), and P(A∩B).

Note that A and B are NOT independent, but they are conditionally independent given C.
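All six quantities can be computed exactly by enumerating the eight (coin, toss 1, toss 2) outcomes with their probabilities, a sketch of which is:

```python
from fractions import Fraction
from itertools import product

half = Fraction(1, 2)
# Enumerate (coin, toss1, toss2) with exact probabilities.
outcomes = []  # (probability, coin, toss1, toss2)
for coin in ("regular", "fake"):
    p_h = half if coin == "regular" else Fraction(1)
    for t1, t2 in product("HT", repeat=2):
        p = half  # coin choice
        p *= p_h if t1 == "H" else 1 - p_h
        p *= p_h if t2 == "H" else 1 - p_h
        outcomes.append((p, coin, t1, t2))

def prob(pred):
    return sum(p for p, *o in outcomes if pred(*o))

p_c = prob(lambda c, t1, t2: c == "regular")
p_a_given_c = prob(lambda c, t1, t2: c == "regular" and t1 == "H") / p_c
p_b_given_c = prob(lambda c, t1, t2: c == "regular" and t2 == "H") / p_c
p_ab_given_c = prob(lambda c, t1, t2: c == "regular" and t1 == "H" and t2 == "H") / p_c
p_a = prob(lambda c, t1, t2: t1 == "H")
p_b = prob(lambda c, t1, t2: t2 == "H")
p_ab = prob(lambda c, t1, t2: t1 == "H" and t2 == "H")
```

The enumeration confirms the note: P(A∩B|C) = P(A|C) P(B|C) = 1/4, yet P(A∩B) = 5/8 ≠ P(A) P(B) = 9/16.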

4. A certain disease affects about 1 out of 10,000 people. There is a test to check whether the person has
the disease. The test is quite accurate. In particular, we know that

- the probability that the test result is positive (suggesting the person has the disease), given
that the person does not have the disease, is only 2 percent;

- the probability that the test result is negative (suggesting the person does not have the
disease), given that the person has the disease, is only 1 percent.

A random person gets tested for the disease and the result comes back positive. What is the probability
that the person has the disease?
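This is a direct application of Bayes' theorem; a sketch with exact rational arithmetic is:

```python
from fractions import Fraction

p_d = Fraction(1, 10_000)             # prior: disease prevalence
p_pos_given_not_d = Fraction(2, 100)  # false positive rate
p_neg_given_d = Fraction(1, 100)      # false negative rate
p_pos_given_d = 1 - p_neg_given_d     # test sensitivity

# Bayes' theorem: P(D | +) = P(+ | D) P(D) / P(+),
# with P(+) expanded by the law of total probability.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
p_d_given_pos = p_pos_given_d * p_d / p_pos
```

The result, 1/203 ≈ 0.49%, is small despite the accurate test, because the disease is so rare that false positives dominate.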
5. The Bayesian network for the problem is given below.

List of all events occurring in this network:


- Burglary (B)
- Earthquake (E)
- Alarm (A)
- David calls (D)
- Sophia calls (S)
Calculate the probability that the alarm has sounded, but neither a burglary nor an earthquake has
occurred, and both David and Sophia have called Harry.

Note: Write the events of the problem statement in probability form, P[D, S, A, ¬B, ¬E], then rewrite
this probability using the joint probability distribution of the network.
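By the chain rule on the network, P(D, S, A, ¬B, ¬E) = P(D|A) P(S|A) P(A|¬B, ¬E) P(¬B) P(¬E). The figure with the conditional probability tables is not reproduced above, so the numbers in this sketch are assumed from the common textbook version of this Burglary/Earthquake example and should be replaced with the values from the given figure:

```python
# ASSUMED CPT values (the figure is not reproduced here); substitute your own.
p_b = 0.002                    # P(Burglary)
p_e = 0.001                    # P(Earthquake)
p_a_given_not_b_not_e = 0.001  # P(Alarm | no burglary, no earthquake)
p_d_given_a = 0.91             # P(David calls | Alarm)
p_s_given_a = 0.75             # P(Sophia calls | Alarm)

# Chain rule on the network:
# P(D, S, A, not-B, not-E) = P(D|A) P(S|A) P(A|not-B, not-E) P(not-B) P(not-E)
p = p_d_given_a * p_s_given_a * p_a_given_not_b_not_e * (1 - p_b) * (1 - p_e)
```

With these assumed values the product comes out to roughly 0.00068.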

6. a) Describe the approaches used by Hill Climbing in Artificial Intelligence for optimizing
mathematical problems.

b) Discuss the problems in Hill Climbing Algorithm and give your views as to which, if any, could be
described as artificially intelligent. Justify your reasoning.
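As a concrete point of reference for both parts, a minimal stochastic hill-climbing sketch (one common variant; the assignment may intend simple or steepest-ascent variants as well) is:

```python
import random

def hill_climb(f, x0, step=0.1, iters=1000, seed=0):
    """Simple stochastic hill climbing: accept a random neighbour only if it
    strictly improves f.  Prone to local maxima, plateaux, and ridges."""
    rng = random.Random(seed)
    x, best = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        val = f(cand)
        if val > best:  # strictly uphill moves only
            x, best = cand, val
    return x, best

# Maximize f(x) = -(x - 3)^2, whose global maximum is at x = 3.
x, val = hill_climb(lambda x: -(x - 3) ** 2, x0=0.0)
```

On this unimodal objective the climber converges near x = 3; on a multimodal objective the same loop can stall at whichever local maximum it reaches first, which is the central problem part (b) asks about.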

7. (a) Describe the motivation behind the simulated annealing algorithm.

(b) The following table shows six evaluations of a simulated annealing algorithm. For each evaluation
give the probability of the next state being accepted (to 4 dp). Assume the objective function is being
maximised.

Current Evaluation    Neighbourhood Evaluation    Current Temperature
75                    65                          25
75                    55                          25
75                    65                          50
75                    55                          50
65                    75                          25
65                    75                          50

Ensure you show the formula you use and describe the terms.
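The standard acceptance rule here is the Metropolis criterion: for a maximisation problem an improving move is always accepted, and a worsening move is accepted with probability exp(Δ/T), where Δ = (neighbourhood evaluation − current evaluation) and T is the current temperature. A sketch that applies it to the six table rows:

```python
import math

def accept_probability(current, neighbour, temperature):
    """Metropolis rule for a MAXIMISATION problem: improving moves are always
    accepted; worsening moves are accepted with probability exp(delta / T),
    where delta = neighbour - current (negative for a worse neighbour)."""
    delta = neighbour - current
    if delta >= 0:
        return 1.0
    return math.exp(delta / temperature)

rows = [(75, 65, 25), (75, 55, 25), (75, 65, 50),
        (75, 55, 50), (65, 75, 25), (65, 75, 50)]
probs = [round(accept_probability(c, n, t), 4) for c, n, t in rows]
```

Note how a higher temperature makes the same worsening move more likely to be accepted, and how the last two rows are improving moves, accepted with probability 1.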
(c) One aspect of a simulated annealing cooling schedule is the temperature. Discuss the following.

i) What is the effect of having the starting temperature too high or too low?

ii) How do you decide on a suitable starting temperature?

iii) How do we decide on a final temperature?

8. Why was machine learning introduced? What are the different types of machine learning algorithms?
Explain supervised, unsupervised, and reinforcement learning with examples.

9. Explain the difference between: (a) classification and regression; (b) artificial intelligence and
machine learning.

10. What is ‘overfitting’ in machine learning? Why does overfitting happen? How can you avoid it?

11. In the following questions you will consider a k-nearest neighbor classifier using Euclidean distance
metric on a binary classification task. We assign the class of the test point to be the class of the majority
of the k nearest neighbors. Note that a point can be its own neighbor.

Consider the following sample point and classify it for k = 3 and k = 5.
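The figure with the sample points is not reproduced above, so the data in this sketch are hypothetical placeholders; the classifier itself is the one the question describes (majority vote among the k nearest neighbours under Euclidean distance):

```python
from collections import Counter
import math

def knn_classify(train, query, k):
    """Classify `query` by majority vote among the k nearest training points
    (Euclidean distance).  `train` is a list of ((x, y), label) pairs."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Hypothetical placeholder data, NOT the assignment's figure.
train = [((1, 1), "+"), ((2, 1), "+"), ((1, 2), "+"),
         ((4, 4), "-"), ((5, 4), "-"), ((4, 5), "-"), ((5, 5), "-")]
label_k3 = knn_classify(train, (2, 2), k=3)
label_k5 = knn_classify(train, (2, 2), k=5)
```

Substituting the points from the figure and re-running for k = 3 and k = 5 answers the question; comparing the two labels shows how the choice of k can change the vote.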

12. Explain the SVM algorithm. What are support vectors in SVM?
13. Support vector machines (SVMs) learn a decision boundary leading to the largest margin from both
classes. You are training SVM on a tiny dataset with 4 points shown in Figure.

This dataset consists of two examples with class label −1 (denoted with plus signs) and two examples
with class label +1 (denoted with triangles).
i. Find the weight vector 𝑤 and bias 𝑏. What’s the equation corresponding to the decision
boundary?
ii. Circle the support vectors and draw the decision boundary.
14. Support vector machines (SVMs) learn a decision boundary leading to the largest margin from both
classes. You are training SVM on a tiny dataset with 8 points < Positively labelled data points (3,1)(3,-
1)(6,1)(6,-1) and Negatively labelled data points (1,0)(0,1)(0,-1)(-1,0) >
i. Find the weight vector 𝑤 and bias 𝑏. What’s the equation corresponding to the decision
boundary?
ii. Circle the support vectors and draw the decision boundary.
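For the 8-point dataset in question 14, inspecting the geometry suggests the gap between the classes lies along the x-axis, giving the candidate solution w = (1, 0), b = −2 (decision boundary x = 2). The sketch below verifies, rather than derives, this candidate by checking the margin constraints y_i (w·x_i + b) ≥ 1 and reading off which points sit exactly on the margin:

```python
# Verify a candidate maximum-margin solution for the question-14 dataset.
positives = [(3, 1), (3, -1), (6, 1), (6, -1)]
negatives = [(1, 0), (0, 1), (0, -1), (-1, 0)]
w, b = (1.0, 0.0), -2.0  # candidate: decision boundary x = 2

def margin(point, label):
    """Functional margin y_i (w . x_i + b); feasible iff >= 1."""
    score = w[0] * point[0] + w[1] * point[1] + b
    return label * score

margins = [margin(p, +1) for p in positives] + [margin(p, -1) for p in negatives]
feasible = all(m >= 1 for m in margins)
# Support vectors are the points that achieve the margin with equality.
support_vectors = [p for p in positives if margin(p, +1) == 1] + \
                  [p for p in negatives if margin(p, -1) == 1]
```

All eight constraints hold, and the points on the margin are (3, 1), (3, −1), and (1, 0), which are the ones to circle.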
15. Support vector machines (SVMs) learn a decision boundary leading to the largest margin from both
classes. You are training SVM on a tiny dataset with 8 points < Positively labelled data points (2,2)(2,-
2)(-2,-2)(-2,2) and Negatively labelled data points (1,1)(1,-1)(-1,-1)(-1,1)>
i. Find the weight vector 𝑤 and bias 𝑏. What’s the equation corresponding to the decision
boundary?
ii. Circle the support vectors and draw the decision boundary.
16. Implementing Basic Logic Gates With Perceptron:
1. AND
2. OR
3. NAND
4. XOR
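The first three gates are linearly separable, so each can be a single threshold unit; XOR is not, so it must be composed from the others. The weight and bias values in this sketch are one common choice, not the only one:

```python
def perceptron(weights, bias):
    """Return a two-input gate implemented as a single threshold unit:
    output 1 iff w1*x1 + w2*x2 + bias > 0."""
    return lambda x1, x2: int(weights[0] * x1 + weights[1] * x2 + bias > 0)

# One common choice of weights/biases for the separable gates.
AND = perceptron((1, 1), -1.5)
OR = perceptron((1, 1), -0.5)
NAND = perceptron((-1, -1), 1.5)

# XOR is not linearly separable, so no single perceptron computes it;
# it can be built from the gates above: XOR(a, b) = AND(OR(a, b), NAND(a, b)).
XOR = lambda a, b: AND(OR(a, b), NAND(a, b))
```

Checking all four input pairs against each gate's truth table is the natural way to confirm the chosen weights, and trying (and failing) to find weights for a single-perceptron XOR illustrates why a second layer is needed.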
