Simple_Neural_Network.ipynb

The document outlines an experiment involving a simple neural network implemented in Python, specifically a single-layer perceptron. It describes the structure and function of the perceptron, including the training process and weight adjustments. The code provided demonstrates the initialization, training, and testing of the neural network with a sample dataset.


{

"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": [],
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
}
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/github/PadmaJyothi-U/Deep-Learning/blob/main/Simple_Neural_Network.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"source": [
"# **Experiment-3**\n",
"\n",
"---\n",
"\n"
],
"metadata": {
"id": "lwotv4L0gMyn"
}
},
{
"cell_type": "markdown",
"source": [
"# **Simple Neural Network**"
],
"metadata": {
"id": "cqrNe3XuhZsJ"
}
},
{
"cell_type": "markdown",
"source": [
"\n",
"The simplest type of neural network is a single-layer perceptron. It consists of one input layer and one output layer, with no hidden layers. The perceptron takes a set of input features and produces a binary output based on a weighted sum of the inputs. The weights are adjusted during training to optimize the output."
],
"metadata": {
"id": "PXF2MXSXo3PD"
}
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "pzyjCDomJteF"
},
"outputs": [],
"source": [
"import numpy as np"
]
},
{
"cell_type": "code",
"source": [
"class NeuralNetwork():\n",
"\n",
"    def __init__(self):\n",
"        # seeding for random number generation\n",
"        np.random.seed(1)\n",
"\n",
"        # initialising weights as a 3 by 1 matrix with values from -1 to 1 and mean of 0\n",
"        self.synaptic_weights = 2 * np.random.random((3, 1)) - 1\n",
"\n",
"    def sigmoid(self, x):\n",
"        # applying the sigmoid function\n",
"        return 1 / (1 + np.exp(-x))\n",
"\n",
"    def sigmoid_derivative(self, x):\n",
"        # computing the derivative of the sigmoid function\n",
"        return x * (1 - x)\n",
"\n",
"    def train(self, training_inputs, training_outputs, training_iterations):\n",
"        # training the model to make accurate predictions while adjusting weights continually\n",
"        for iteration in range(training_iterations):\n",
"            # pass the training data through the neuron\n",
"            output = self.think(training_inputs)\n",
"\n",
"            # computing the error for back-propagation\n",
"            error = training_outputs - output\n",
"\n",
"            # performing weight adjustments\n",
"            adjustments = np.dot(training_inputs.T, error * self.sigmoid_derivative(output))\n",
"\n",
"            self.synaptic_weights += adjustments\n",
"\n",
"    def think(self, inputs):\n",
"        # passing the inputs through the neuron to get the output\n",
"        # converting values to floats\n",
"        inputs = inputs.astype(float)\n",
"        output = self.sigmoid(np.dot(inputs, self.synaptic_weights))\n",
"        return output\n"
],
"metadata": {
"id": "mmFVZrNxJvCS"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"if __name__ == \"__main__\":\n",
"    # Initialise a single-neuron neural network.\n",
"    neural_network = NeuralNetwork()\n",
"    print(\"Random starting synaptic weights: \")\n",
"    print(neural_network.synaptic_weights)\n",
"\n",
"    # The training set. We have 4 examples, each consisting of 3 input values\n",
"    # and 1 output value.\n",
"    training_set_inputs = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])\n",
"    training_set_outputs = np.array([[0, 1, 1, 0]]).T\n",
"\n",
"    # Train the neural network using the training set.\n",
"    # Do it 10,000 times and make small adjustments each time.\n",
"    neural_network.train(training_set_inputs, training_set_outputs, 10000)\n",
"\n",
"    print(\"New synaptic weights after training: \")\n",
"    print(neural_network.synaptic_weights)\n",
"\n",
"    # Test the neural network with a new situation.\n",
"    print(\"Considering new situation [1, 0, 0] -> ?: \")\n",
"    print(neural_network.think(np.array([1, 0, 0])))"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "sMasE_W3gElA",
"outputId": "a6b2f8d5-5262-408b-e057-1631c6ffbaaa"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Random starting synaptic weights: \n",
"[[-0.16595599]\n",
" [ 0.44064899]\n",
" [-0.99977125]]\n",
"New synaptic weights after training: \n",
"[[ 9.67299303]\n",
" [-0.2078435 ]\n",
" [-4.62963669]]\n",
"Considering new situation [1, 0, 0] -> ?: \n",
"[0.99993704]\n"
]
}
]
},
{
"cell_type": "code",
"source": [],
"metadata": {
"id": "d70_WGP7gInI"
},
"execution_count": null,
"outputs": []
}
]
}
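
The notebook's training loop can be condensed into a short standalone script for quick experimentation outside Colab. This is a minimal sketch, assuming the same seed, dataset, and 10,000 iterations as the notebook; the `train` helper below is not part of the original code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(inputs, outputs, iterations, seed=1):
    # initialise a 3x1 weight vector in [-1, 1), as in the notebook
    np.random.seed(seed)
    weights = 2 * np.random.random((3, 1)) - 1
    for _ in range(iterations):
        prediction = sigmoid(inputs @ weights)
        error = outputs - prediction
        # update scaled by the sigmoid derivative, prediction * (1 - prediction)
        weights += inputs.T @ (error * prediction * (1 - prediction))
    return weights

# same toy dataset as the notebook: the label equals the first input column
X = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]], dtype=float)
y = np.array([[0, 1, 1, 0]]).T

w = train(X, y, 10000)
pred = sigmoid(np.array([1.0, 0.0, 0.0]) @ w)
print(pred)  # close to 1, matching the notebook's [0.99993704]
```

Because the label is simply the first input feature, 10,000 updates drive the first weight strongly positive and the network generalises to the unseen input [1, 0, 0].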
