Roll No.:

Dr B R Ambedkar National Institute of Technology, Jalandhar
B Tech (Computer Science and Engineering), 4th Semester
CSPC-204, Machine Learning
End Semester Examination, May 2021
Duration: 02 Hours          Max. Marks: 40          Date: 11th May 2021
Question Number   1   2   3   4   5   6   7   8
Marks             6   8   5   5   3   6   3   4
(CO No. and Learning Level rows illegible in the source)
Note:
1. Attempt all the questions.
2. Write the answers in hard copy (on A4 sheets) using a blue/black pen, with
   your signature at the top left and the page number at the top right corner
   of each page of the answer booklet.
3. The time allowed for writing the examination is 02 hours. An extra 15
   minutes is allowed for preparing the PDF file of the answer booklet and
   submitting it.
4. Follow the instructions regarding submission of the answer booklet as
   issued by the examination section.
5. If you are assuming something, then clearly state your assumptions.

1. Let x1, x2, ..., xn be independent samples from the following
   distribution: P(x | θ) = θx^(−θ−1), where θ > 1, x ≥ 1.
   Find the maximum likelihood estimator of θ. [6]
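For this density the log-likelihood is ℓ(θ) = n·ln θ − (θ + 1)·Σ ln xi, and setting dℓ/dθ = n/θ − Σ ln xi = 0 gives a closed form. The sketch below assumes the density reconstructed above, θx^(−θ−1); the helper name `theta_mle` is illustrative, not part of the paper.

```python
import math

def theta_mle(xs):
    # Closed-form MLE for f(x; theta) = theta * x**(-theta - 1), x >= 1:
    # maximising n*ln(theta) - (theta + 1) * sum(ln x_i) over theta
    # gives theta_hat = n / sum(ln x_i).
    return len(xs) / sum(math.log(x) for x in xs)

print(theta_mle([math.e ** 0.5] * 4))  # each ln x_i = 0.5, so theta_hat = 2.0
```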

2. The F1 Mercedes team wants to investigate whether its cars will
   qualify for a particular Grand Prix based on some binary features.
   Table 1 shows the features of five cars. The features are Body,
   Engine, and Wheel. The Prediction column shows the labeled output.
   (a) Compute the entropy H(Prediction) at the root node. [2]
   (b) Draw the decision tree. Explain with numbers why you choose
       the splits. [4]
   (c) What is the worst-case running time to build the decision tree,
       and why? [2]
        Engine    Body      Wheel     Prediction
Car 1   Success   Failure   Failure   Disqualify
Car 2   Success   Failure   Success   Qualify
Car 3   Failure   Success   Failure   Qualify
Car 4   Success   Success   Success   Qualify
Car 5   Success   Success   Failure   Disqualify

Table 1: Data set of the Team Mercedes in F1
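Part (a) can be checked numerically: the root node of Table 1 has 3 Qualify and 2 Disqualify labels, so H = −(3/5)·log2(3/5) − (2/5)·log2(2/5) ≈ 0.971 bits. A minimal sketch (the `entropy` helper is illustrative):

```python
import math

def entropy(labels):
    # Shannon entropy (in bits) of a list of class labels.
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

# Prediction column of Table 1: 3 Qualify, 2 Disqualify
root = ['Disqualify', 'Qualify', 'Qualify', 'Qualify', 'Disqualify']
print(round(entropy(root), 3))  # 0.971
```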
3. A data set is given in Table 2. Apply the candidate elimination
   algorithm to the data sequence in Table 2. You need to find the
   general hypothesis (G) and the specific hypothesis (S) after each
   step. [5]
        Engine    Body      Wheel     Prediction
Car 1   Success   Success   Success   Qualify
Car 2   Success   Success   Failure   Qualify
Car 3   Success   Failure   Success   Disqualify
Car 4   Failure   Success   Success   Disqualify
Car 5   Failure   Failure   Success   Disqualify
Car 6   Failure   Failure   Failure   Disqualify

Table 2: Data set of the Team Mercedes in F1
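For reference, the candidate elimination loop on Table 2 can be sketched as below. The helper names are illustrative; hypotheses are attribute tuples over (Engine, Body, Wheel) where '?' matches anything and '0' marks the initial most-specific hypothesis.

```python
VALUES = ('Success', 'Failure')

def matches(h, x):
    return all(hv in ('?', xv) for hv, xv in zip(h, x))

def more_general(g, s):
    # True if g is at least as general as s.
    return all(gv == '?' or gv == sv for gv, sv in zip(g, s))

def generalize(s, x):
    # Minimal generalization of s that covers positive example x.
    return tuple(xv if sv == '0' else sv if sv == xv else '?'
                 for sv, xv in zip(s, x))

def specializations(g, x):
    # Minimal specializations of g that exclude negative example x.
    return [g[:i] + (v,) + g[i + 1:]
            for i, gv in enumerate(g) if gv == '?'
            for v in VALUES if v != x[i]]

data = [  # (Engine, Body, Wheel), Qualify?
    (('Success', 'Success', 'Success'), True),
    (('Success', 'Success', 'Failure'), True),
    (('Success', 'Failure', 'Success'), False),
    (('Failure', 'Success', 'Success'), False),
    (('Failure', 'Failure', 'Success'), False),
    (('Failure', 'Failure', 'Failure'), False),
]

S = [('0', '0', '0')]
G = [('?', '?', '?')]
for x, positive in data:
    if positive:
        G = [g for g in G if matches(g, x)]
        S = [generalize(s, x) for s in S]
        S = [s for s in S if any(more_general(g, s) for g in G)]
    else:
        S = [s for s in S if not matches(s, x)]
        G = [g2 for g in G
             for g2 in ([g] if not matches(g, x) else specializations(g, x))
             if any(more_general(g2, s) for s in S)]

print('S =', S)  # [('Success', 'Success', '?')]
print('G =', G)  # [('Success', 'Success', '?')]
```

With this example sequence S and G converge to the same hypothesis, (Engine = Success, Body = Success, Wheel = ?).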
4. Consider a classroom with n students. The students are divided into
   three groups (A, B, and C) based on their height and, separately, on
   their weight. Each student's record is also associated with a
   prediction of whether the student is selected for NCC or not. You
   need to use a Naive Bayes classifier. The likelihood tables are
   given below. Find the posterior probabilities for the data point
   (A, C), i.e., P(NY | H = A, W = C) and P(NN | H = A, W = C). [5]
   Notation: NY = selected for NCC, NN = not selected for NCC.
Height:      NY     NN          Weight:      NY     NN
    A        0.2    0.3             A        0.4    0.1
    B        0.4    0.6             B        0.5    0.3
    C        0.4    0.1             C        0.1    0.6
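With the likelihoods above, P(class | H = A, W = C) ∝ P(H = A | class) · P(W = C | class) · P(class). The class priors are not stated in the question, so the sketch below assumes a uniform prior P(NY) = P(NN) = 0.5:

```python
# Likelihood tables from the question; the uniform prior is an assumption.
height = {'NY': {'A': 0.2, 'B': 0.4, 'C': 0.4},
          'NN': {'A': 0.3, 'B': 0.6, 'C': 0.1}}
weight = {'NY': {'A': 0.4, 'B': 0.5, 'C': 0.1},
          'NN': {'A': 0.1, 'B': 0.3, 'C': 0.6}}
prior = {'NY': 0.5, 'NN': 0.5}

# Unnormalized joint for the data point (H = A, W = C), then normalize.
joint = {c: height[c]['A'] * weight[c]['C'] * prior[c] for c in prior}
z = sum(joint.values())
posterior = {c: joint[c] / z for c in joint}
print(posterior)  # approximately {'NY': 0.1, 'NN': 0.9}
```

Under the uniform-prior assumption the point (A, C) is classified as NN.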
5. There are eight data points: D1 (2,5), D2 (7,5), D3 (10,2), D4
   (1,2), D5 (3,6), D6 (5,8), D7 (4,8), and D8 (9,1). You need to
   cluster these points into three different clusters. [3]
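The question does not fix a clustering algorithm, so the sketch below uses plain k-means with k = 3 and D1, D2, D3 as assumed initial centroids; with this seeding it converges to the clusters {D1, D4, D5}, {D2, D6, D7}, {D3, D8}:

```python
def kmeans(points, centroids, iters=20):
    # Plain k-means with squared Euclidean distance.
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        # Assignment step: each point goes to its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centroids]
            clusters[d.index(min(d))].append(p)
        # Update step: move each centroid to its cluster mean.
        centroids = [(sum(p[0] for p in cl) / len(cl),
                      sum(p[1] for p in cl) / len(cl))
                     for cl in clusters if cl]
    return clusters, centroids

pts = [(2, 5), (7, 5), (10, 2), (1, 2), (3, 6), (5, 8), (4, 8), (9, 1)]
clusters, cents = kmeans(pts, [(2, 5), (7, 5), (10, 2)])  # seeds D1, D2, D3
print(clusters)
```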
6. (a) What is the time complexity to create the bootstrapped set in
       terms of n, where n is the number of training patterns? [1]
   (b) What are the elements of the version space? How are they
       ordered? What can be said about the meaning and sizes of S
       and G? [3]
   (c) Describe different issues in decision tree learning with
       examples. [2]
7. We split in a decision tree based on the maximum information gain.
   Prove or disprove: in any path from the root split to a leaf, the
   same feature will never be split twice. [3]
8. Apply the backpropagation algorithm to the neural network given
   below. Find the weights after the first iteration. Assume that the
   biases B1 and B2 are 1, the inputs A1 and A2 are 0.05 and 0.01,
   respectively, and the target outputs R1 and R2 are 0.85 and 0.15,
   respectively. Clearly explain each step to get full marks. [4]
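The network figure and its initial weights are not reproduced in this copy, so the sketch below runs one backpropagation step on a generic 2-2-2 sigmoid network with hypothetical initial weights and learning rate (squared-error loss); only the inputs, bias values, and targets are taken from the question:

```python
import math

def sig(z):
    return 1.0 / (1.0 + math.exp(-z))

# Given in the question
a = [0.05, 0.01]   # inputs A1, A2
t = [0.85, 0.15]   # target outputs R1, R2
bias = 1.0         # B1 = B2 = 1
lr = 0.5           # hypothetical learning rate

# Hypothetical initial weights (the paper's figure is not reproduced here)
w1 = [[0.15, 0.20], [0.25, 0.30]]   # w1[j][i]: input i -> hidden j
b1 = [0.35, 0.35]                   # B1 -> hidden weights
w2 = [[0.40, 0.45], [0.50, 0.55]]   # w2[k][j]: hidden j -> output k
b2 = [0.60, 0.60]                   # B2 -> output weights

# Forward pass
h = [sig(sum(w1[j][i] * a[i] for i in range(2)) + b1[j] * bias) for j in range(2)]
o = [sig(sum(w2[k][j] * h[j] for j in range(2)) + b2[k] * bias) for k in range(2)]
loss = 0.5 * sum((t[k] - o[k]) ** 2 for k in range(2))

# Backward pass: delta terms for squared-error loss and sigmoid units
d_o = [(o[k] - t[k]) * o[k] * (1 - o[k]) for k in range(2)]
d_h = [sum(d_o[k] * w2[k][j] for k in range(2)) * h[j] * (1 - h[j])
       for j in range(2)]

# One gradient-descent update of every weight
for k in range(2):
    for j in range(2):
        w2[k][j] -= lr * d_o[k] * h[j]
    b2[k] -= lr * d_o[k] * bias
for j in range(2):
    for i in range(2):
        w1[j][i] -= lr * d_h[j] * a[i]
    b1[j] -= lr * d_h[j] * bias

print('updated hidden weights:', w1, b1)
print('updated output weights:', w2, b2)
```

The exam answer follows exactly these steps, substituting the weights read off the figure for the hypothetical values above.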
