
Sukkur Institute of Business Administration University

Department of Computer Systems Engineering


Deep Learning
Handout # 03:
Introduction to Neural Networks: Perceptron, Backpropagation, and
Deep Learning

Lab Conduction Date: ___________________

Instructor: Engr. Muhammad Irfan Younas

Note: Submit this lab handout in the next lab with the solved activities and exercises attached

S. No.   Criterion   0.5                               0.25             0.125                   Score
1        Accuracy    Desired output                    Minor mistake    Critical mistake
2        Timing      Submitted within the given time   1 day late       More than 3 days late
Total Score Achieved

Submission Profile

Name: Submission date:

Marks obtained: Receiving authority name and signature:

Comments:

________________________________________________________________________________

Instructor Signature
Learning Outcomes
After completing this lab, students will be able to:
1. Implement a single-layer perceptron for binary classification and understand its limitations.
2. Implement a neural network from scratch using NumPy, including forward and backward propagation.
3. Build a deep neural network model for regression using TensorFlow/Keras.
4. Train a multi-layer perceptron (MLP) to classify non-linearly separable data (the XOR problem).

Lab Hardware and Software Requirements


Hardware Requirements:
• A computer with at least:
o Processor: Intel Core i3 or higher (or equivalent)
o RAM: 8 GB or higher
o Storage: 20 GB free disk space
o Operating System: Windows 10/11, macOS, or Linux
Software Requirements:
• Python 3.x (latest stable version)
• Jupyter Notebook (via Anaconda) or Google Colab
• NumPy, Pandas, Matplotlib, and Scikit-Learn libraries
• TensorFlow and Keras libraries
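
If any of these libraries are missing, one way to install them (a sketch assuming a standard pip environment; Google Colab ships with all of them preinstalled) is:

pip install numpy pandas matplotlib scikit-learn tensorflow

Note that Keras is bundled with TensorFlow 2.x as tensorflow.keras, so no separate Keras installation is needed.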

Lab Activities
Task 1: Implementing a Single-Layer Perceptron
Step 1: Import Required Libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import Perceptron
from sklearn.metrics import accuracy_score
Step 2: Define AND, OR, and XOR datasets
# Define input features and target labels
X = np.array([[0,0], [0,1], [1,0], [1,1]])
AND_y = np.array([0, 0, 0, 1])
OR_y = np.array([0, 1, 1, 1])
XOR_y = np.array([0, 1, 1, 0])
Step 3: Train and Test the Perceptron
# Train Perceptron on AND data
perceptron = Perceptron()
perceptron.fit(X, AND_y)
AND_predictions = perceptron.predict(X)
print(f"AND Gate Accuracy: {accuracy_score(AND_y, AND_predictions)}")
Repeat for OR and XOR:
# OR Gate
perceptron.fit(X, OR_y)
OR_predictions = perceptron.predict(X)
print(f"OR Gate Accuracy: {accuracy_score(OR_y, OR_predictions)}")

# XOR Gate (expected to fail)
perceptron.fit(X, XOR_y)
XOR_predictions = perceptron.predict(X)
print(f"XOR Gate Accuracy: {accuracy_score(XOR_y, XOR_predictions)}")
Observation:
• The perceptron successfully classifies AND and OR.
• The perceptron fails on XOR because XOR is not linearly separable; a single-layer perceptron can only learn linear decision boundaries.
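
To visualize why, the four XOR points can be plotted with the matplotlib import from Step 1 (a minimal sketch; the plotting details are illustrative, not part of the original handout):

# Plot the XOR points; no single straight line separates the two classes
plt.scatter(X[XOR_y == 0][:, 0], X[XOR_y == 0][:, 1], marker='o', label='class 0')
plt.scatter(X[XOR_y == 1][:, 0], X[XOR_y == 1][:, 1], marker='s', label='class 1')
plt.xlabel('x1')
plt.ylabel('x2')
plt.legend()
plt.title('XOR is not linearly separable')
plt.show()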

Task 2: Implementing a Neural Network from Scratch (NumPy)


Step 1: Define the Activation Function and Derivative
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Note: x here is assumed to already be a sigmoid output,
    # so this computes s * (1 - s) without recomputing sigmoid(x)
    return x * (1 - x)
Step 2: Initialize Network Parameters
np.random.seed(42)

X_train = np.array([[0,0],[0,1],[1,0],[1,1]])
y_train = np.array([[0], [1], [1], [0]])

# Initialize weights and biases for a 2-2-1 network
weights_input_hidden = np.random.rand(2, 2)   # input (2) -> hidden (2)
weights_hidden_output = np.random.rand(2, 1)  # hidden (2) -> output (1)
bias_hidden = np.random.rand(1, 2)
bias_output = np.random.rand(1, 1)
Step 3: Train the Neural Network
learning_rate = 0.1
epochs = 10000

for epoch in range(epochs):
    # Forward pass
    hidden_layer_input = np.dot(X_train, weights_input_hidden) + bias_hidden
    hidden_layer_output = sigmoid(hidden_layer_input)

    final_input = np.dot(hidden_layer_output, weights_hidden_output) + bias_output
    final_output = sigmoid(final_input)

    # Compute error
    error = y_train - final_output

    # Backpropagation
    d_output = error * sigmoid_derivative(final_output)
    d_hidden = np.dot(d_output, weights_hidden_output.T) * sigmoid_derivative(hidden_layer_output)

    # Update weights and biases
    weights_hidden_output += np.dot(hidden_layer_output.T, d_output) * learning_rate
    weights_input_hidden += np.dot(X_train.T, d_hidden) * learning_rate
    bias_output += np.sum(d_output, axis=0, keepdims=True) * learning_rate
    bias_hidden += np.sum(d_hidden, axis=0, keepdims=True) * learning_rate
Observation:
• This multi-layer neural network successfully classifies XOR.
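
A quick forward pass with the trained weights (a sketch reusing the variables from the training loop above) confirms this; the outputs should be close to [0, 1, 1, 0]:

# Forward pass with the trained weights
hidden = sigmoid(np.dot(X_train, weights_input_hidden) + bias_hidden)
predictions = sigmoid(np.dot(hidden, weights_hidden_output) + bias_output)
print(np.round(predictions, 3))  # values near 0 or 1 indicate correct XOR outputs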

Task 3: Building a Deep Neural Network for Regression (TensorFlow/Keras)


Step 1: Import Required Libraries
import tensorflow as tf
from tensorflow import keras
Step 2: Build and Train the Model
model = keras.Sequential([
    keras.layers.Dense(64, activation='relu', input_shape=(2,)),
    keras.layers.Dense(64, activation='relu'),
    keras.layers.Dense(1)
])

model.compile(optimizer='adam', loss='mse', metrics=['mae'])

model.fit(X_train, y_train, epochs=100, verbose=1, validation_split=0.2)
Step 3: Evaluate the Model
loss, mae = model.evaluate(X_train, y_train)
print(f"Mean Absolute Error on training data: {mae}")

Task 4: Classifying XOR using MLP in TensorFlow


Step 1: Define and Train an MLP
mlp_model = keras.Sequential([
    keras.layers.Dense(4, activation='relu', input_shape=(2,)),
    keras.layers.Dense(1, activation='sigmoid')
])

mlp_model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

mlp_model.fit(X_train, y_train, epochs=1000, verbose=1)
Step 2: Evaluate the Model
accuracy = mlp_model.evaluate(X_train, y_train)[1]
print(f"XOR Classification Accuracy: {accuracy}")
Observation:
• The MLP correctly classifies XOR, demonstrating the power of hidden layers.
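
As a final check (a sketch reusing the trained mlp_model), the rounded predictions should reproduce the XOR truth table:

# Rounded predicted probabilities; expected output: [0 1 1 0]
print(np.round(mlp_model.predict(X_train)).astype(int).flatten())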

Lab Questions

1. What is the main limitation of a single-layer perceptron?

2. Why do we need activation functions like sigmoid or ReLU in neural networks?

3. What does the fit() function do in Keras models?

4. What is the significance of gradient descent in training neural networks?

5. How does a deep neural network improve regression performance?

Lab Exercises

1. Modify the perceptron model to classify an OR gate.

2. Modify the neural network to classify a XOR gate with two hidden layers.
