
Artificial Neural Network

Perceptron

▪ Simple Classifier
▪ Perceptron
▪ Weights & Bias
▪ Perceptron Learning

Example Dataset

height (x1) | width (x2) | Class (y)
0.5         | -0.35      | C1
0.35        | -0.3       | C1
-0.4        | 0.2        | C2
-0.5        | 0.1        | C2
-0.22       | -0.5       | ??

[Figure: scatter plot of the instances in (x1, x2) space; the unlabeled point is marked "?"]

Can we find a linear combination of the input features (x1, x2) for classification (y)?
Simple classifier

 x1 and x2 are normalized features of an instance.
 y = f(x) is the class label (Class C1 or Class C2).
 Can f(x) be formed from a linear combination of x1 and x2? A weighted sum can be obtained.
 Consider a data instance I1: x1 = 0.5, x2 = -0.35. Is y = C1 or C2?
 If the weights are taken as w1 = 0.75 and w2 = 0.25, the weighted sum is

w1*x1 + w2*x2 = 0.75*0.5 + 0.25*(-0.35) = 0.2875
Simple classifier

 y = f(u), where f is defined as a step function for the decision:

f(u) = 1 if u ≥ 0.1
       0 if u < 0.1

weighted sum (u): w1*x1 + w2*x2 = 0.75*0.5 + 0.25*(-0.35) = 0.2875

 In our example, u = 0.2875, so y = 1.
 => Input data (I1: x1 = 0.5, x2 = -0.35) is classified as class C1.
 For another instance (I3), u = -0.25, thus y = 0.
 => I3 is classified as class C2.
 For another instance (I9), u = 0.0545, thus y = 0.
 => I9 is classified as class C2.
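The classifier above can be sketched in a few lines of Python, using the slide's weights (w1 = 0.75, w2 = 0.25) and threshold (0.1); the function names are illustrative:

```python
def weighted_sum(x1, x2, w1=0.75, w2=0.25):
    """Linear combination of the two normalized features."""
    return w1 * x1 + w2 * x2

def step(u, threshold=0.1):
    """Step activation: 1 (class C1) if u >= threshold, else 0 (class C2)."""
    return 1 if u >= threshold else 0

# I1 from the slide: x1 = 0.5, x2 = -0.35
u = weighted_sum(0.5, -0.35)   # ~0.2875
print(step(u))                 # 1 -> class C1
```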
Simple classifier

 A bias b (= -0.1) is added to the weighted sum (u) to shift the threshold to 0.

weighted sum (v): w1*x1 + w2*x2 + b
= 0.75*0.5 + 0.25*(-0.35) + (-0.1)
= 0.1875

x1    | x2    | b    | y
0.5   | -0.35 | -0.1 | C1
0.35  | -0.3  | -0.1 | C1
-0.4  | 0.2   | -0.1 | C2
-0.5  | 0.1   | -0.1 | C2

 Now, the step function y = f(x) looks like

f(u + b) = f(v) = 1 if v ≥ 0
                  0 if v < 0

[Figure: plot of the step function y = f(v), jumping from 0 to 1 at v = 0]

For (I1), v = 0.1875, so y = C1.
For (I3), v = -0.35, so y = C2.
For (I9), v = -0.0455, so y = C2.
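A minimal sketch of the biased version, classifying the four labeled rows of the table (weights 0.75 and 0.25 and bias -0.1 as on the slide; the function name is illustrative):

```python
W1, W2, BIAS = 0.75, 0.25, -0.1  # weights and bias from the slide

def classify(x1, x2):
    """Return 'C1' if the biased weighted sum is non-negative, else 'C2'."""
    v = W1 * x1 + W2 * x2 + BIAS
    return "C1" if v >= 0 else "C2"

rows = [(0.5, -0.35), (0.35, -0.3), (-0.4, 0.2), (-0.5, 0.1)]
print([classify(x1, x2) for x1, x2 in rows])  # ['C1', 'C1', 'C2', 'C2']
```

Note that shifting the threshold into the bias leaves the decisions unchanged: v is just u - 0.1, compared against 0 instead of 0.1.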
Simple classifier – Another Example

Problem: Will a credit card holder (Y) be a defaulter or not? The features are:
a) Married (yes/no)  b) Own car (yes/no)  c) Job (yes/no)

X1 | X2 | X3 | Y
1  | 0  | 0  | -1
1  | 0  | 1  | 1
1  | 1  | 0  | 1
1  | 1  | 1  | 1
0  | 0  | 1  | -1
0  | 1  | 0  | -1
0  | 1  | 1  | 1
0  | 0  | 0  | -1

[Figure: a "black box" with input nodes X1, X2, X3, each connected to the output node Y with weight 0.3, and an output threshold t = 0.4: Y = 1 if 0.3*X1 + 0.3*X2 + 0.3*X3 > 0.4, else Y = -1]
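This model can be checked against the full truth table in a few lines (whether the comparison is strict ">" or "≥" is an assumption; either reproduces the table, since the weighted sums are 0, 0.3, 0.6, or 0.9 and never equal 0.4):

```python
def credit_perceptron(x1, x2, x3, w=0.3, t=0.4):
    """Output 1 if the weighted sum of the three binary features exceeds t, else -1."""
    return 1 if w * x1 + w * x2 + w * x3 > t else -1

# Truth table from the slide: (X1, X2, X3) -> Y
table = {(1, 0, 0): -1, (1, 0, 1): 1, (1, 1, 0): 1, (1, 1, 1): 1,
         (0, 0, 1): -1, (0, 1, 0): -1, (0, 1, 1): 1, (0, 0, 0): -1}
print(all(credit_perceptron(*x) == y for x, y in table.items()))  # True
```

In effect the unit outputs 1 exactly when at least two of the three features are "yes".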
Neural Networks

• Artificial neural networks (ANNs) were developed to mimic the processing of neurons in the human brain and are designed to recognize patterns.

• ANNs are the backbone of deep learning.

[Figure: biological neurons with an irregular connectivity pattern vs. neurons in an ANN with a regular pattern of "fully connected" layers – a "3-layer Neural Net", or "2-hidden-layer Neural Net"]

Image from CC0
Brain Neuron

[Figure: diagram of a biological neuron with parts labeled A, B, and C]
Neural Network

[Figure: input signals from other neurons arrive at the neuron through synapses and produce the output y]
Artificial Neural Network (ANN)

1. Neuron Model
 Basic processing unit.

2. Architecture
 Connection of multiple neurons.

3. Learning Algorithm
 Updates weights to model a specific task.
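The deck names "Perceptron Learning" in its outline but does not spell out the update rule. As a minimal sketch, the classic perceptron learning rule nudges the weights toward each misclassified example; the learning rate eta and the toy AND-style dataset below are illustrative assumptions, not from the slides:

```python
def train_perceptron(data, eta=0.1, max_epochs=100):
    """Standard perceptron learning rule on 2-feature data with labels in {0, 1}.

    Weights and bias start at zero; each misclassification moves the decision
    boundary toward the offending point. (Toy sketch, not from the slides.)
    """
    w1 = w2 = b = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for x1, x2, target in data:
            y = 1 if w1 * x1 + w2 * x2 + b >= 0 else 0
            err = target - y          # 0 if correct, +1/-1 if wrong
            if err:
                mistakes += 1
                w1 += eta * err * x1
                w2 += eta * err * x2
                b += eta * err
        if mistakes == 0:             # converged: every example classified correctly
            break
    return w1, w2, b

# Toy linearly separable data (logical AND), labels 0/1
data = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
w1, w2, b = train_perceptron(data)
print(all((1 if w1 * x1 + w2 * x2 + b >= 0 else 0) == t for x1, x2, t in data))  # True
```

Because the data is linearly separable, the perceptron convergence theorem guarantees the loop terminates with all examples classified correctly.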
Artificial Neuron Model

• The basic computational element is called a node or unit. It receives input from other units, or from an external source.

• Each input has an associated weight w, which can be modified during learning to train the model.

• The weight of an edge determines the strength of the connection between the nodes.

• The unit computes a function of the weighted sum of its inputs.

[Figure: inputs x1, x2, …, xn with weights w1, w2, …, wn feed a summation unit followed by an activation function f; the output is yi = f(neti)]
Perceptron

• Simplest ANN: Perceptron (single neuron)

• The perceptron is used for binary classification.

• Learns linear decision boundaries
 In a 2-dimensional feature space, the separator is a line.
Architecture of Perceptron

Input → Synapse → Cell Body/Soma → Axon

[Figure: inputs x1, x2, …, xn with weights w1, w2, …, wn and bias b feed a summation unit followed by the activation f]

y = f( Σ_{j=1}^{n} x_j w_j + b )
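The general n-input form of this formula can be sketched directly (the function name is illustrative; the step activation matches the earlier slides):

```python
def perceptron_forward(x, w, b):
    """Perceptron output y = f(sum_j x_j * w_j + b) with a step activation."""
    v = sum(xj * wj for xj, wj in zip(x, w)) + b
    return 1 if v >= 0 else 0

# I1 from the earlier example: weights (0.75, 0.25), bias -0.1
print(perceptron_forward([0.5, -0.35], [0.75, 0.25], -0.1))  # 1 -> class C1
```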
Geometrical Interpretation of Perceptron