Backpropagation Example

This document explains One-Hot Encoding for the categorical variables of a "Play Tennis" dataset ('Outlook', 'Temperature', 'Humidity', 'Wind'), then walks through Python code for a small neural network with sigmoid activations trained by backpropagation. A second example applies the same training and prediction steps to a different dataset.

For this dataset, the standard approach is One-Hot Encoding: each category value becomes its own binary (0/1) column. A pandas sketch follows the tables below.

📌 One-Hot Encoding example for this dataset:

Original column: Outlook

Outlook    Outlook_Sunny  Outlook_Overcast  Outlook_Rain
Sunny      1              0                 0
Overcast   0              1                 0
Rain       0              0                 1

Similarly for Temperature:

Temperature  Temp_Hot  Temp_Mild  Temp_Cool
Hot          1         0          0
Mild         0         1          0
Cool         0         0          1

Humidity:

Humidity  Hum_High  Hum_Normal
High      1         0
Normal    0         1

Wind:

Wind    Wind_Weak  Wind_Strong
Weak    1          0
Strong  0          1

Target (Play Tennis):

Play Tennis  Label
Yes          1
No           0
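In practice this encoding does not have to be built by hand. A minimal pandas sketch (the sample rows here are illustrative, not the full Play Tennis table):

import pandas as pd

# Illustrative rows with the four categorical columns from the tables above
df = pd.DataFrame({
    "Outlook":     ["Sunny", "Overcast", "Rain"],
    "Temperature": ["Hot", "Mild", "Cool"],
    "Humidity":    ["High", "Normal", "High"],
    "Wind":        ["Weak", "Strong", "Weak"],
})

# get_dummies creates one binary column per category value
encoded = pd.get_dummies(df, dtype=int)
print(encoded)

The column names produced by get_dummies (e.g. Outlook_Sunny) match the tables above.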

import numpy as np

# Sigmoid and its derivative
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Expects the sigmoid *output* s, since s * (1 - s) equals sigmoid'(z)
    return x * (1 - x)

# Input dataset: 10 one-hot columns encoding the 4 categorical features
# Format: [Outlook_Sunny, Outlook_Overcast, Outlook_Rain,
#          Temp_Hot, Temp_Mild, Temp_Cool,
#          Hum_High, Hum_Normal, Wind_Weak, Wind_Strong]

X = np.array([
    [1, 0, 0, 1, 0, 0, 1, 0, 1, 0],
    [1, 0, 0, 1, 0, 0, 1, 0, 0, 1],
    [0, 1, 0, 1, 0, 0, 1, 0, 1, 0],
    [0, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    [0, 0, 1, 0, 0, 1, 0, 1, 1, 0],
    [0, 0, 1, 0, 0, 1, 0, 1, 0, 1],
    [0, 1, 0, 0, 0, 1, 0, 1, 0, 1],
    [1, 0, 0, 0, 1, 0, 1, 0, 1, 0],
    [1, 0, 0, 0, 0, 1, 0, 1, 1, 0],
    [0, 0, 1, 0, 1, 0, 0, 1, 1, 0],
    [1, 0, 0, 0, 1, 0, 0, 1, 0, 1],
    [0, 1, 0, 0, 1, 0, 1, 0, 0, 1],
    [0, 1, 0, 1, 0, 0, 0, 1, 1, 0],
    [0, 0, 1, 0, 1, 0, 1, 0, 0, 1]
])

# Output: 1 = Play Tennis, 0 = Don't Play
y = np.array([[0], [0], [1], [1], [1], [0], [1], [0], [1], [1], [1], [1], [1], [0]])

# Seed for reproducibility
np.random.seed(1)

# Network dimensions
input_layer_neurons = X.shape[1]  # 10 one-hot input columns
hidden_layer_neurons = 5
output_neurons = 1

# Random initial weights and biases
weights_input_hidden = np.random.uniform(size=(input_layer_neurons, hidden_layer_neurons))  # (10, 5)
bias_hidden = np.random.uniform(size=(1, hidden_layer_neurons))  # (1, 5)
weights_hidden_output = np.random.uniform(size=(hidden_layer_neurons, output_neurons))  # (5, 1)
bias_output = np.random.uniform(size=(1, output_neurons))  # (1, 1)

# Training parameters
epochs = 50
learning_rate = 0.5

# Training the neural network
for epoch in range(epochs):
    # Forward pass
    hidden_layer_input = np.dot(X, weights_input_hidden) + bias_hidden
    hidden_layer_output = sigmoid(hidden_layer_input)
    final_input = np.dot(hidden_layer_output, weights_hidden_output) + bias_output
    final_output = sigmoid(final_input)

    # Backpropagation
    error = y - final_output
    d_output = error * sigmoid_derivative(final_output)
    error_hidden = d_output.dot(weights_hidden_output.T)
    d_hidden = error_hidden * sigmoid_derivative(hidden_layer_output)

    # Update weights and biases
    weights_hidden_output += hidden_layer_output.T.dot(d_output) * learning_rate
    bias_output += np.sum(d_output, axis=0, keepdims=True) * learning_rate
    weights_input_hidden += X.T.dot(d_hidden) * learning_rate
    bias_hidden += np.sum(d_hidden, axis=0, keepdims=True) * learning_rate

# Final output after training
print("Final output after training:\n", final_output)

# Test the network
def predict(features):
    hidden = sigmoid(np.dot(features, weights_input_hidden) + bias_hidden)
    output = sigmoid(np.dot(hidden, weights_hidden_output) + bias_output)
    return output

# Example: Should I play tennis?
# Outlook = Overcast, Temperature = Hot, Humidity = Normal, Wind = Weak
test_input = np.array([[0, 1, 0, 1, 0, 0, 0, 1, 1, 0]])

decision = predict(test_input)
print("\nTest decision (1 = Play, 0 = Don't Play):", decision)
print("Decision:", "Play" if decision >= 0.5 else "Don't Play")

Example 2

import numpy as np

# Sigmoid and its derivative
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Expects the sigmoid *output* s, since s * (1 - s) equals sigmoid'(z)
    return x * (1 - x)

# Input dataset (4 binary features)
# Format: [weather, time, money, friends]

X = np.array([
    [1, 1, 1, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
    [1, 1, 1, 0]
])

# Output decision: 1 = go to movie, 0 = don't go
y = np.array([[1], [1], [0], [0], [1], [1]])

# Seed for reproducibility
np.random.seed(1)

# Network dimensions
input_layer_neurons = X.shape[1]  # 4 inputs
hidden_layer_neurons = 3
output_neurons = 1

# Random initial weights and biases
weights_input_hidden = np.random.uniform(size=(input_layer_neurons, hidden_layer_neurons))  # (4, 3)
bias_hidden = np.random.uniform(size=(1, hidden_layer_neurons))  # (1, 3)
weights_hidden_output = np.random.uniform(size=(hidden_layer_neurons, output_neurons))  # (3, 1)
bias_output = np.random.uniform(size=(1, output_neurons))  # (1, 1)

# Training parameters
epochs = 2  # note: only 2 epochs, so the network is barely trained
learning_rate = 0.5

# Training the neural network
for epoch in range(epochs):
    # Forward pass
    hidden_layer_input = np.dot(X, weights_input_hidden) + bias_hidden
    hidden_layer_output = sigmoid(hidden_layer_input)
    final_input = np.dot(hidden_layer_output, weights_hidden_output) + bias_output
    final_output = sigmoid(final_input)

    # Backpropagation
    error = y - final_output
    d_output = error * sigmoid_derivative(final_output)
    error_hidden = d_output.dot(weights_hidden_output.T)
    d_hidden = error_hidden * sigmoid_derivative(hidden_layer_output)

    # Update weights and biases
    weights_hidden_output += hidden_layer_output.T.dot(d_output) * learning_rate
    bias_output += np.sum(d_output, axis=0, keepdims=True) * learning_rate
    weights_input_hidden += X.T.dot(d_hidden) * learning_rate
    bias_hidden += np.sum(d_hidden, axis=0, keepdims=True) * learning_rate

# Final output after training
print("Final output after training:\n", final_output)

# Test the network
def predict(features):
    hidden = sigmoid(np.dot(features, weights_input_hidden) + bias_hidden)
    output = sigmoid(np.dot(hidden, weights_hidden_output) + bias_output)
    return output

# Example: Should I go to the movie?
# Weather = good, Time = yes, Money = no, Friends = yes
test_input = np.array([[1, 1, 0, 1]])

decision = predict(test_input)
print("\nTest decision (1 = Go, 0 = Don't Go):", decision)
print("Decision:", "Go" if decision >= 0.5 else "Don't Go")
