ESTD. 2001
PRATHYUSHA ENGINEERING COLLEGE
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

LAB MANUAL

CS3491 - ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING LABORATORY

(Regulation 2021, IV Semester, Even Semester)

YEAR: 2023 - 2024

PREPARED BY
MEENA,
Assistant Professor / CSE

COURSE OBJECTIVES:
1. Study about uninformed and heuristic search techniques.
2. Learn techniques for reasoning under uncertainty.
3. Introduce Machine Learning and supervised learning algorithms.
4. Study about ensembling and unsupervised learning algorithms.
5. Learn the basics of deep learning using neural networks.
EXPERIMENTS LIST
1. Implementation of Uninformed search algorithms (BFS, DFS)
2. Implementation of Informed search algorithms (A*, memory-bounded A*)
3. Implement naive Bayes models
4. Implement Bayesian Networks
5. Build Regression models
6. Build decision trees and random forests
7. Build SVM models
8. Implement ensembling techniques
9. Implement clustering algorithms
10. Implement EM for Bayesian networks
11. Build simple NN models
12. Build deep learning NN models

COURSE OUTCOMES:
On completion of the course, students will be able to:
CO1: Use appropriate search algorithms for problem solving
CO2: Apply reasoning under uncertainty
CO3: Build supervised learning models
CO4: Build ensembling and unsupervised models
CO5: Build deep learning neural network models
TEXT BOOKS:
1. Stuart Russell and Peter Norvig, "Artificial Intelligence - A Modern Approach", Fourth Edition, Pearson Education, 2021.
2. Ethem Alpaydin, "Introduction to Machine Learning", Fourth Edition, MIT Press, 2020.
REFERENCES:
1. Dan W. Patterson, "Introduction to Artificial Intelligence and Expert Systems", Pearson Education, 2007.
2. Kevin Knight, Elaine Rich, B. Nair, "Artificial Intelligence", McGraw Hill, 2008.
3. Patrick H. Winston, "Artificial Intelligence", Third Edition, Pearson Education, 2006.
4. Deepak Khemani, "Artificial Intelligence", Tata McGraw Hill Education, 2013 (http://nptel.ac.in/).
5. Christopher M. Bishop, "Pattern Recognition and Machine Learning", Springer, 2006.
6. Tom Mitchell, "Machine Learning", McGraw Hill, 3rd Edition, 1997.
7. Charu C. Aggarwal, "Data Classification: Algorithms and Applications", CRC Press, 2014.
8. Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, "Foundations of Machine Learning", MIT Press, 2012.
9. Ian Goodfellow, Yoshua Bengio, Aaron Courville, "Deep Learning", MIT Press.
1. Implementation of Uninformed search algorithms (BFS, DFS)
Aim:
To implement uninformed search algorithms such as BFS and DFS.
Algorithm:
Step 1: Initialize an empty list called 'visited' to keep track of the nodes visited during the traversal.
Step 2: Initialize an empty queue called 'queue' to keep track of the nodes to be traversed in the future.
Step 3: Add the starting node to the 'visited' list and the 'queue'.
Step 4: While the 'queue' is not empty, do the following:
    a. Dequeue the first node from the 'queue' and store it in a variable called 'current'.
    b. Print 'current'.
    c. For each of the neighbours of 'current' that have not been visited, do the following:
        i. Mark the neighbour as visited and add it to the 'queue'.
Step 5: When all the nodes reachable from the starting node have been visited, terminate the algorithm.
Breadth First Search:

# Graph as an adjacency-list dictionary; the definition is illegible in
# this copy and is reconstructed here to match the sample output below.
graph = {
    '5': ['3', '7'],
    '3': ['2', '4'],
    '7': ['8'],
    '2': [],
    '4': ['8'],
    '8': []
}

visited = []   # List to keep track of visited nodes
queue = []     # Initialize the queue

def bfs(visited, graph, node):
    visited.append(node)
    queue.append(node)
    while queue:
        m = queue.pop(0)
        print(m, end=" ")
        for neighbour in graph[m]:
            if neighbour not in visited:
                visited.append(neighbour)
                queue.append(neighbour)

print("Following is the Breadth-First Search")
bfs(visited, graph, '5')
Output:
Following is the Breadth-First Search
5 3 7 2 4 8

Depth First Search:
Algorithm:
Step 1: Initialize an empty set called 'visited' to keep track of the nodes visited during the traversal.
Step 2: Define a DFS function that takes the current node, the graph, and the 'visited' set as parameters.
Step 3: If the current node is not in the 'visited' set, do the following:
    a. Print the current node.
    b. Add the current node to the 'visited' set.
    c. For each of the neighbours of the current node, call the DFS function recursively with the neighbour as the current node.
Step 4: When all the nodes reachable from the starting node have been visited, terminate the algorithm.
visited = set()  # Set to keep track of visited nodes

def dfs(visited, graph, node):
    if node not in visited:
        print(node)
        visited.add(node)
        for neighbour in graph[node]:
            dfs(visited, graph, neighbour)

print("Following is the Depth-First Search")
dfs(visited, graph, '5')
Output:
Following is the Depth-First Search
5
3
2
4
8
7
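For comparison, the same traversal can be written without recursion. The sketch below (not part of the original manual) uses an explicit stack and the same graph dictionary as above:

# Iterative DFS with an explicit stack (same adjacency-list graph as above)
def dfs_iterative(graph, start):
    visited = set()
    stack = [start]
    while stack:
        node = stack.pop()
        if node not in visited:
            print(node)
            visited.add(node)
            # push neighbours in reverse so they pop in adjacency order
            stack.extend(reversed(graph[node]))

dfs_iterative(graph, '5')  # prints the same order: 5 3 2 4 8 7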
Result:
Thus the uninformed search algorithms BFS and DFS have been executed successfully and the output got verified.

2. Implementation of Informed search algorithm (A*)
Aim:
To implement the informed search algorithm A*.
Algorithm:
1. Initialize the distances dictionary with float('inf') for all vertices in the graph except for the start vertex, which is set to 0.
2. Initialize the parent dictionary with None for all vertices in the graph.
3. Initialize an empty set for visited vertices.
4. Initialize a priority queue (pq) with a tuple containing the sum of the heuristic value and the distance from start to the current vertex, the distance from start to the current vertex, and the current vertex.
5. While pq is not empty, do the following:
    a. Dequeue the vertex with the smallest f-distance (the sum of the heuristic value and the distance from start to the current vertex).
    b. If the current vertex is the destination vertex, return distances and parent.
    c. If the current vertex has not been visited, add it to the visited set.
    d. For each neighbor of the current vertex, do the following:
        i. Calculate the distance from start to the neighbor (g) as the sum of the distance from start to the current vertex and the edge weight between the current vertex and the neighbor.
        ii. Calculate the f-distance (f = g + h) for the neighbor.
        iii. If the f-distance for the neighbor is less than its current distance in the distances dictionary, update the distances dictionary with the new distance and the parent dictionary with the current vertex as the parent of the neighbor.
        iv. Enqueue the neighbor with its f-distance, distance from start to neighbor, and the neighbor itself into the priority queue.
6. Return distances and parent.
Program:
import heapq

def a_star(graph, start, dest, heuristic):
    distances = {vertex: float('inf') for vertex in graph}
    distances[start] = 0
    parent = {vertex: None for vertex in graph}
    visited = set()
    pq = [(0 + heuristic[start], 0, start)]  # O(E) space
    while pq:
        curr_f_dist, curr_dist, curr_vert = heapq.heappop(pq)
        if curr_vert not in visited:
            visited.add(curr_vert)
            for nbor, weight in graph[curr_vert].items():
                distance = curr_dist + weight             # distance from start (g)
                f_distance = distance + heuristic[nbor]   # f = g + h
                # Only process the neighbor if its f-distance is lower
                if f_distance < distances[nbor]:
                    distances[nbor] = f_distance
                    parent[nbor] = curr_vert
                    if nbor == dest:
                        # we found a path based on the heuristic
                        return distances, parent
                    heapq.heappush(pq, (f_distance, distance, nbor))  # O(log E) time
    return distances, parent

def generate_path_from_parents(parent, start, dest):
    path = []
    curr = dest
    while curr:
        path.append(curr)
        curr = parent[curr]
    return '->'.join(path[::-1])
# The graph and heuristic dictionaries defined at this point in the
# original manual are illegible in this copy; supply a weighted graph of
# the form {vertex: {neighbor: weight}} and a heuristic dictionary
# {vertex: estimate} before calling a_star.
distances, parent = a_star(graph, start, dest, heuristic)
print('distances => ', distances)
print('parent => ', parent)
print('optimal path => ', generate_path_from_parents(parent, start, dest))
Output (only partially legible in this copy):
distances => {'A': 16, ..., 'I': 28, 'K': 28, 'L': 28, 'M': 32, 'N': 30, 'O': 30, ...}
parent => {'A': None, 'B': 'A', ..., 'H': 'C', 'I': 'H', ...}
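Since the original graph and heuristic are unreadable, the following self-contained example (hypothetical vertices, edge weights, and heuristic values, not the original data) shows the expected input format and exercises both functions:

# Hypothetical weighted graph and heuristic estimates (illustration only)
graph = {
    'A': {'B': 5, 'C': 8},
    'B': {'D': 10},
    'C': {'D': 3},
    'D': {},
}
heuristic = {'A': 9, 'B': 8, 'C': 3, 'D': 0}

distances, parent = a_star(graph, 'A', 'D', heuristic)
print('optimal path => ', generate_path_from_parents(parent, 'A', 'D'))
# prints: optimal path =>  A->C->D  (cost 11, cheaper than 15 via B)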
Result:
Thus the program to implement the informed search algorithm A* has been executed successfully and the output got verified.

3. Implement Naive Bayes models
Aim:
To diagnose the climate dataset with the Naive Bayes Classifier Algorithm.
Algorithm:
1. Import the required libraries: numpy, matplotlib.pyplot, pandas, seaborn.
2. Load the dataset from the given CSV file "NaiveBayes.csv" using the pandas "read_csv()" function.
3. Separate the input and output variables from the dataset by using the "iloc" and "values" methods and assign them to the "X" and "y" variables respectively.
4. Split the dataset into training and testing datasets using the "train_test_split()" function from the "sklearn.model_selection" module. Assign the returned data to the "X_train", "X_test", "y_train" and "y_test" variables.
5. Standardize the input data using the "StandardScaler()" class from the "sklearn.preprocessing" module. Fit the scaler on the training data and apply the same transformation to the testing data, assigning the results back to "X_train" and "X_test".
6. Create a Bernoulli Naive Bayes classifier object using the "BernoulliNB()" function from the "sklearn.naive_bayes" module and assign it to the "classifier" variable.
7. Train the Bernoulli Naive Bayes classifier using the "fit()" method of the "classifier" object by passing the "X_train" and "y_train" variables as arguments.
8. Predict the output values for the test dataset using the "predict()" method of the "classifier" object and assign them to the "y_pred" variable.
9. Calculate the accuracy score of the model by passing the predicted output values "y_pred" and actual output values "y_test" to the "accuracy_score()" function from the "sklearn.metrics" module and print the result.
10. Create a Gaussian Naive Bayes classifier object using the "GaussianNB()" function from the "sklearn.naive_bayes" module and assign it to the "classifier1" variable.
11. Train the Gaussian Naive Bayes classifier using the "fit()" method of the "classifier1" object by passing the "X_train" and "y_train" variables as arguments.
12. Predict the output values for the test dataset using the "predict()" method of the "classifier1" object and assign them to the "y_pred1" variable.
13. Calculate the accuracy score of the model by passing the predicted output values "y_pred1" and actual output values "y_test" to the "accuracy_score()" function from the "sklearn.metrics" module and print the result.

Program:
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns
dataset = pd.read_csv(‘NaiveBayes.csv')
# split the data into inputs and outputs
X = dataset.iloc[:, [0, 1]].values
y = dataset.iloc[:, 2].values

from sklearn.model_selection import train_test_split
# assign test data size 25%
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

from sklearn.preprocessing import StandardScaler
# scaling the input data
sc_X = StandardScaler()
X_train = sc_X.fit_transform(X_train)
X_test = sc_X.transform(X_test)
from sklearn.naive_bayes import BernoulliNB
# initializing the Bernoulli Naive Bayes classifier
classifier = BernoulliNB()
# training the model
classifier.fit(X_train, y_train)
# testing the model
y_pred = classifier.predict(X_test)

from sklearn.metrics import accuracy_score
# printing the accuracy of the model
print(accuracy_score(y_test, y_pred))

from sklearn.naive_bayes import GaussianNB
# create a Gaussian Naive Bayes classifier
classifier1 = GaussianNB()
# training the model
classifier1.fit(X_train, y_train)
# testing the model
y_pred1 = classifier1.predict(X_test)
# printing the accuracy of the model
print(accuracy_score(y_test, y_pred1))
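As a quick usage check, a single new observation can be classified once it is passed through the same scaler. The feature values below are invented for illustration, since the columns of NaiveBayes.csv are not documented here:

# Hypothetical raw sample (invented numbers); it must be scaled with the
# scaler fitted on the training data before prediction.
new_sample = sc_X.transform([[30, 87000]])
print("BernoulliNB prediction:", classifier.predict(new_sample))
print("GaussianNB prediction :", classifier1.predict(new_sample))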
Result:
Thus the program with the Naive Bayes Classifier Algorithm has been executed successfully and the output got verified.

4. Implement Bayesian Networks
Aim:
To construct a Bayesian network, to demonstrate the diagnosis of heart patients using the standard Heart Disease Data Set.
Algorithm:
Step 1: Import required modules
Step 2: Define network structure
Step 3: Define the parameters using CPT
Step 4: Associate the parameters with the model structure
Step 5: Check if the cpds are valid for the model
Step 6: View nodes and edges of the model
Step 7: Check independencies of a node
Step 8: List all Independencies
Program:
from pgmpy.models import BayesianNetwork
from pgmpy.inference import VariableElimination

# Defining network structure
alarm_model = BayesianNetwork(
    [
        ("Burglary", "Alarm"),
        ("Earthquake", "Alarm"),
        ("Alarm", "JohnCalls"),
        ("Alarm", "MaryCalls"),
    ]
)

# Defining the parameters using CPT
from pgmpy.factors.discrete import TabularCPD

cpd_burglary = TabularCPD(
    variable="Burglary", variable_card=2, values=[[0.999], [0.001]]
)
cpd_earthquake = TabularCPD(
    variable="Earthquake", variable_card=2, values=[[0.998], [0.002]]
)
cpd_alarm = TabularCPD(
    variable="Alarm",
    variable_card=2,
    values=[[0.999, 0.71, 0.06, 0.05], [0.001, 0.29, 0.94, 0.95]],
    evidence=["Burglary", "Earthquake"],
    evidence_card=[2, 2],
)
cpd_johncalls = TabularCPD(
    variable="JohnCalls",
    variable_card=2,
    values=[[0.95, 0.1], [0.05, 0.9]],
    evidence=["Alarm"],
    evidence_card=[2],
)
cpd_marycalls = TabularCPD(
    variable="MaryCalls",
    variable_card=2,
    values=[[0.1, 0.7], [0.9, 0.3]],
    evidence=["Alarm"],
    evidence_card=[2],
)

# Associating the parameters with the model structure
alarm_model.add_cpds(
    cpd_burglary, cpd_earthquake, cpd_alarm, cpd_johncalls, cpd_marycalls
)

# Checking if the cpds are valid for the model
alarm_model.check_model()

# Viewing nodes of the model
alarm_model.nodes()

# Viewing edges of the model
alarm_model.edges()

# Checking independencies of a node
alarm_model.local_independencies("Burglary")

# Listing all independencies
alarm_model.get_independencies()
Output:
NodeView(('Burglary', 'Alarm', 'Earthquake', 'JohnCalls', 'MaryCalls'))
OutEdgeView([('Burglary', 'Alarm'), ('Alarm', 'JohnCalls'), ('Alarm', 'MaryCalls'), ('Earthquake', 'Alarm')])
(Burglary ⊥ Earthquake)
(MaryCalls ⊥ Earthquake, Burglary, JohnCalls | Alarm)
(MaryCalls ⊥ Burglary, JohnCalls | Earthquake, Alarm)
(MaryCalls ⊥ Earthquake, JohnCalls | Burglary, Alarm)
(MaryCalls ⊥ Earthquake, Burglary | JohnCalls, Alarm)
(MaryCalls ⊥ JohnCalls | Earthquake, Burglary, Alarm)
(MaryCalls ⊥ Burglary | Earthquake, JohnCalls, Alarm)
(MaryCalls ⊥ Earthquake | Burglary, JohnCalls, Alarm)
(JohnCalls ⊥ Earthquake, Burglary, MaryCalls | Alarm)
(JohnCalls ⊥ Burglary, MaryCalls | Earthquake, Alarm)
(JohnCalls ⊥ Earthquake, MaryCalls | Burglary, Alarm)
(JohnCalls ⊥ Earthquake, Burglary | MaryCalls, Alarm)
(JohnCalls ⊥ MaryCalls | Earthquake, Burglary, Alarm)
(JohnCalls ⊥ Burglary | Earthquake, MaryCalls, Alarm)
(JohnCalls ⊥ Earthquake | Burglary, MaryCalls, Alarm)
(Earthquake ⊥ Burglary)
(Earthquake ⊥ MaryCalls, JohnCalls | Alarm)
(Earthquake ⊥ MaryCalls, JohnCalls | Burglary, Alarm)
(Earthquake ⊥ JohnCalls | MaryCalls, Alarm)
(Earthquake ⊥ MaryCalls | JohnCalls, Alarm)
(Earthquake ⊥ JohnCalls | Burglary, MaryCalls, Alarm)
(Earthquake ⊥ MaryCalls | Burglary, JohnCalls, Alarm)
(Burglary ⊥ Earthquake)
(Burglary ⊥ MaryCalls, JohnCalls | Alarm)
(Burglary ⊥ MaryCalls, JohnCalls | Earthquake, Alarm)
(Burglary ⊥ JohnCalls | MaryCalls, Alarm)
(Burglary ⊥ MaryCalls | JohnCalls, Alarm)
(Burglary ⊥ JohnCalls | Earthquake, MaryCalls, Alarm)
(Burglary ⊥ MaryCalls | Earthquake, JohnCalls, Alarm)
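The program imports VariableElimination but never calls it; a minimal posterior query on the same model (standard pgmpy usage) would look like this:

# P(Burglary | JohnCalls = 1, MaryCalls = 1) by variable elimination
infer = VariableElimination(alarm_model)
posterior = infer.query(variables=["Burglary"], evidence={"JohnCalls": 1, "MaryCalls": 1})
print(posterior)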
Result:
Thus the program to implement a Bayesian network has been executed successfully and the output got verified.

5. Build Regression models
Aim:
To build regression models such as locally weighted linear regression and plot the
necessary graphs.
Algorithm:
1. Read the given data sample to X and the curve (linear or non-linear) to Y.
2. Set the value for the smoothening (free) parameter, say tau.
3. Set the bias / point of interest x0, which is a subset of X.
4. Determine the weight matrix using:

    w(x, x0) = exp( -(x - x0)^2 / (2 * tau^2) )

5. Determine the value of the model parameter beta using:

    beta(x0) = (X^T W X)^(-1) X^T W y

6. Prediction = x0 * beta.
Program:
from math import ceil
import numpy as np
from scipy import linalg

def lowess(x, y, f, iterations):
    n = len(x)
    r = int(ceil(f * n))
    # bandwidth: distance to the r-th nearest neighbour of each point
    h = [np.sort(np.abs(x - x[i]))[r] for i in range(n)]
    w = np.clip(np.abs((x[:, None] - x[None, :]) / h), 0.0, 1.0)
    w = (1 - w ** 3) ** 3  # tricube weights
    yest = np.zeros(n)
    delta = np.ones(n)
    for iteration in range(iterations):
        for i in range(n):
            weights = delta * w[:, i]
            b = np.array([np.sum(weights * y), np.sum(weights * y * x)])
            A = np.array([[np.sum(weights), np.sum(weights * x)],
                          [np.sum(weights * x), np.sum(weights * x * x)]])
            beta = linalg.solve(A, b)
            yest[i] = beta[0] + beta[1] * x[i]
        # robustifying step: downweight points with large residuals
        residuals = y - yest
        s = np.median(np.abs(residuals))
        delta = np.clip(residuals / (6.0 * s), -1, 1)
        delta = (1 - delta ** 2) ** 2
    return yest

import math
n = 100
x = np.linspace(0, 2 * math.pi, n)
y = np.sin(x) + 0.3 * np.random.randn(n)
f = 0.25
iterations = 3
yest = lowess(x, y, f, iterations)

import matplotlib.pyplot as plt
plt.plot(x, y, "r.")
plt.plot(x, yest, "b-")
Output:
[Plot: the noisy sine samples (red dots) overlaid with the LOWESS fit (blue curve)]
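As a sanity check of the hand-written smoother, the fit can be compared against the LOWESS implementation shipped with statsmodels (assuming statsmodels is installed; this sketch is not part of the original manual):

# statsmodels' LOWESS returns an (n, 2) array: sorted x paired with fitted y
from statsmodels.nonparametric.smoothers_lowess import lowess as sm_lowess
smoothed = sm_lowess(y, x, frac=0.25, it=3)
plt.plot(smoothed[:, 0], smoothed[:, 1], "g--")
plt.show()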
Result:
Thus the program to implement the non-parametric Locally Weighted Regression algorithm, fitting data points with a graph visualization, has been executed successfully.

6. Build decision trees and random forests
Aim:
To implement the concept of decision trees with a suitable dataset from real-world problems using the CART algorithm.
Algorithm:
Steps in the CART algorithm:
1. It begins with the original set S as the root node.
2. On each iteration of the algorithm, it iterates through every unused attribute of the set S and calculates the Gini index of this attribute (a small worked computation of the Gini index follows this list).
3. The Gini index works with the categorical target variable "Success" or "Failure" and performs only binary splits.
4. The set S is then split by the selected attribute to produce subsets of the data.
5. The algorithm continues to recur on each subset, considering only attributes never selected before.
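For concreteness, the Gini index of a node is 1 - sum(p_i^2), where p_i is the fraction of samples of class i in the node: 0 for a pure node and 0.5 at worst for two classes. A tiny hand computation (illustrative only; sklearn performs this internally):

# Gini index of a node: 1 - sum of squared class proportions
def gini(labels):
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

print(gini([0, 0, 0, 1]))  # 1 - (0.75**2 + 0.25**2) = 0.375
print(gini([0, 1, 0, 1]))  # maximally impure two-class node: 0.5
print(gini([1, 1, 1, 1]))  # pure node: 0.0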
Program:
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

# NOTE: the dataset filename is partially illegible in the source; the
# widely used 'Social_Network_Ads.csv' matches the columns referenced below.
data = pd.read_csv('Social_Network_Ads.csv')
data.head()

feature_cols = ['Age', 'EstimatedSalary']
x = data.iloc[:, [2, 3]].values
y = data.iloc[:, 4].values

from sklearn.model_selection import train_test_split
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.25, random_state=0)

from sklearn.preprocessing import StandardScaler
sc_x = StandardScaler()
x_train = sc_x.fit_transform(x_train)
x_test = sc_x.transform(x_test)

from sklearn.tree import DecisionTreeClassifier
classifier = DecisionTreeClassifier()
classifier = classifier.fit(x_train, y_train)
y_pred = classifier.predict(x_test)

from sklearn import metrics
print('Accuracy Score:', metrics.accuracy_score(y_test, y_pred))
from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_test, y_pred)
print(cm)
from matplotlib.colors import ListedColormap
x_set, y_set = x_test, y_test
x1, x2 = np.meshgrid(np.arange(start=x_set[:, 0].min() - 1, stop=x_set[:, 0].max() + 1, step=0.01),
                     np.arange(start=x_set[:, 1].min() - 1, stop=x_set[:, 1].max() + 1, step=0.01))
plt.contourf(x1, x2, classifier.predict(np.array([x1.ravel(), x2.ravel()]).T).reshape(x1.shape),
             alpha=0.75, cmap=ListedColormap(("red", "green")))
plt.xlim(x1.min(), x1.max())
plt.ylim(x2.min(), x2.max())
for i, j in enumerate(np.unique(y_set)):
    plt.scatter(x_set[y_set == j, 0], x_set[y_set == j, 1],
                c=ListedColormap(("red", "green"))(i), label=j)
plt.title("Decision Tree (Test set)")
plt.xlabel("Age")
plt.ylabel("Estimated Salary")
plt.legend()
plt.show()
from sklearn.tree import export_graphviz
from six import StringIO
from IPython.display import Image
import pydotplus

dot_data = StringIO()
export_graphviz(classifier, out_file=dot_data, filled=True, rounded=True,
                special_characters=True, feature_names=feature_cols, class_names=['0', '1'])
graph = pydotplus.graph_from_dot_data(dot_data.getvalue())
Image(graph.write_png('decisiontree.png'))

# Pruned tree: limit depth to 3 using the Gini criterion
classifier = DecisionTreeClassifier(criterion="gini", max_depth=3)
classifier = classifier.fit(x_train, y_train)
y_pred = classifier.predict(x_test)
print("Accuracy:", metrics.accuracy_score(y_test, y_pred))

dot_data = StringIO()
export_graphviz(classifier, out_file=dot_data, filled=True, rounded=True,
                special_characters=True, feature_names=feature_cols, class_names=['0', '1'])
graph = pydotplus.graph_from_dot_data(dot_data.getvalue())
Image(graph.write_png('opt_decisiontree_gini.png'))

Output of decision tree without pruning:
[Tree visualization saved as decisiontree.png; the pruned tree is saved as opt_decisiontree_gini.png]
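The experiment title also calls for random forests. A minimal sketch on the same split, using sklearn's RandomForestClassifier (the n_estimators value is chosen for illustration, not taken from the original manual):

# Random forest on the same scaled split
from sklearn.ensemble import RandomForestClassifier
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(x_train, y_train)
y_pred_rf = rf.predict(x_test)
print("Random Forest Accuracy:", metrics.accuracy_score(y_test, y_pred_rf))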
Result:
Thus the program to implement the concept of decision trees with a suitable dataset from real-world problems using the CART algorithm has been executed successfully.