28/01/15
Perceptrons
Linear separability
w0 + w1 x1 + w2 x2 = 0
The hyperplane w0 + w1 x1 + w2 x2 + … + wn xn = 0
divides the space into two regions.
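The side of the hyperplane a point falls on determines its class. A minimal sketch of this decision rule (the weight values below are arbitrary example values, not from the slides):

```python
# Decide which side of the hyperplane w0 + w1*x1 + ... + wn*xn = 0
# a point lies on; the sign of the weighted sum gives the class.
def classify(weights, x):
    # weights = [w0, w1, ..., wn]; x = [x1, ..., xn]
    net = weights[0] + sum(w * xi for w, xi in zip(weights[1:], x))
    return 1 if net >= 0 else -1

# Example hyperplane: -1 + x1 + x2 = 0 (the line x1 + x2 = 1)
w = [-1.0, 1.0, 1.0]
print(classify(w, [1.0, 1.0]))   # -> 1 (point on the positive side)
print(classify(w, [0.0, 0.0]))   # -> -1 (point on the negative side)
```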
LINEAR SEPARABILITY
[Figure: linearly separable pattern sets in the plane — x: class I (output = 1); o: class II (output = -1).]
Perceptron Model
net = Σ_{i=0..n} wi xi
O = f(net)

[Figure: inputs x0, x1, …, xn weighted by w0, w1, …, wn feed a summing junction producing net, followed by a hard limiter f(·) that gives the output O.]
Fig. 3.1 Schematic diagram of the Perceptron.

[Figure: the decision boundary Σ_{i=0..n} wi xi = 0 separating Class C1 from Class C2 in the plane.]
Fig. 3.2 Illustration of the hyperplane (in this example, a straight line)
as decision boundary for a two-dimensional, two-class pattern classification problem.
SLDP
For the Perceptron to function properly, the two classes C1 and C2 must be
linearly separable.
[Figure: two panels (a) and (b) showing Class C1 and Class C2 with a decision boundary — the linearly separable and non-separable cases.]
Algorithm continued…
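The algorithm slides did not survive extraction; a minimal sketch of the standard discrete-perceptron learning rule (the learning rate, zero initialisation, and epoch limit below are my assumptions, not the slides' values):

```python
def train_perceptron(samples, targets, eta=1.0, max_epochs=100):
    """Discrete perceptron training: on a misclassification,
    w <- w + eta * (d - o) * x, with the bias folded in as x0 = 1."""
    n = len(samples[0])
    w = [0.0] * (n + 1)                 # w[0] is the bias weight
    for _ in range(max_epochs):
        errors = 0
        for x, d in zip(samples, targets):
            xa = [1.0] + list(x)        # augmented input (bias term)
            net = sum(wi * xi for wi, xi in zip(w, xa))
            o = 1 if net >= 0 else -1   # hard limiter
            if o != d:                  # update only on mistakes
                w = [wi + eta * (d - o) * xi for wi, xi in zip(w, xa)]
                errors += 1
        if errors == 0:                 # a full error-free epoch: converged
            break
    return w
```

For linearly separable data the loop terminates with a weight vector that classifies every sample correctly; with discrete ±1 outputs the factor (d − o) is simply ±2.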
Example:
Build the Perceptron network to realize fundamental logic
gates such as AND, OR, and XOR.
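As a sketch of this exercise (my own minimal implementation, not the slides' code): a two-input perceptron with a bias and a hard limiter can be trained on a gate's truth table. AND and OR are linearly separable, so training converges:

```python
def train(gate_targets, eta=1.0, max_epochs=100):
    """Train a 2-input perceptron (bias + hard limiter) on a truth table.
    gate_targets gives the desired +/-1 output for inputs
    (0,0), (0,1), (1,0), (1,1) in that order."""
    inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
    w = [0.0, 0.0, 0.0]                     # [bias, w1, w2]
    for _ in range(max_epochs):
        errors = 0
        for (x1, x2), d in zip(inputs, gate_targets):
            o = 1 if w[0] + w[1] * x1 + w[2] * x2 >= 0 else -1
            if o != d:                      # perceptron update on mistakes
                w[0] += eta * (d - o)
                w[1] += eta * (d - o) * x1
                w[2] += eta * (d - o) * x2
                errors += 1
        if errors == 0:
            return w, True                  # converged: truth table realized
    return w, False                         # no error-free epoch found

AND = [-1, -1, -1, 1]
OR  = [-1, 1, 1, 1]
print(train(AND))   # converges to weights realizing AND
print(train(OR))    # converges to weights realizing OR
```

XOR's truth table [-1, 1, 1, -1] is not linearly separable, so the same procedure never reaches an error-free epoch for it.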
Results
[Figure: two plots of training Error vs. Number of epochs (about 10 epochs shown).]
[Figure: training Error vs. Number of epochs (up to 50 epochs shown).]
SLCP
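The SLCP slides' body was lost in extraction. Assuming SLCP here denotes the single-layer continuous perceptron, training replaces the hard limiter with a differentiable activation and uses a gradient (delta-rule) update; a minimal sketch under that assumption, with a bipolar sigmoid and my own choice of learning rate and epoch count:

```python
import math

def train_continuous(samples, targets, eta=0.5, epochs=500):
    """Delta-rule training for a single continuous perceptron with the
    bipolar sigmoid f(net) = 2/(1 + exp(-net)) - 1, whose derivative
    can be written as (1 - o**2)/2."""
    n = len(samples[0])
    w = [0.1] * (n + 1)                        # small nonzero start; bias first
    for _ in range(epochs):
        for x, d in zip(samples, targets):
            xa = [1.0] + list(x)               # augmented input (bias term)
            net = sum(wi * xi for wi, xi in zip(w, xa))
            o = 2.0 / (1.0 + math.exp(-net)) - 1.0
            delta = (d - o) * 0.5 * (1.0 - o * o)   # gradient of squared error
            w = [wi + eta * delta * xi for wi, xi in zip(w, xa)]
    return w
```

Unlike the discrete rule, every presentation adjusts the weights by an amount proportional to the output error, so the error decreases smoothly rather than in discrete jumps.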
Perceptron Convergence Theorem
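The theorem's body did not survive extraction; the standard statement (Novikoff's bound, supplied here from general knowledge rather than from the slides) is:

```latex
% Perceptron convergence theorem (standard Novikoff form).
\textbf{Theorem.} Let the training pairs $(x_k, d_k)$, $d_k \in \{-1, +1\}$,
be linearly separable: there exist a unit vector $w^{*}$ and a margin
$\gamma > 0$ with $d_k\, (w^{*\top} x_k) \ge \gamma$ for all $k$.
If $\|x_k\| \le R$ for all $k$, then the perceptron learning rule,
applied whenever a sample is misclassified, makes at most
\[
  \frac{R^{2}}{\gamma^{2}}
\]
weight updates before classifying every training sample correctly.
```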
Limitations of Perceptron
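The classic limitation is that a single-layer perceptron can only realize linearly separable functions, so it never reaches zero error on XOR. A quick check (my own sketch, not the slides' code) contrasting AND with XOR:

```python
def perceptron_errors(targets, eta=1.0, epochs=100):
    """Run perceptron training on a 2-input truth table and report the
    number of misclassifications in the final epoch."""
    inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
    w = [0.0, 0.0, 0.0]                     # [bias, w1, w2]
    errors = 0
    for _ in range(epochs):
        errors = 0
        for (x1, x2), d in zip(inputs, targets):
            o = 1 if w[0] + w[1] * x1 + w[2] * x2 >= 0 else -1
            if o != d:                      # standard perceptron update
                w[0] += eta * (d - o)
                w[1] += eta * (d - o) * x1
                w[2] += eta * (d - o) * x2
                errors += 1
    return errors

print(perceptron_errors([-1, -1, -1, 1]))  # AND: 0 errors (separable)
print(perceptron_errors([-1, 1, 1, -1]))   # XOR: errors never reach 0
```

An error-free epoch would imply a separating line exists; for XOR no such line exists, so every epoch keeps at least one misclassification. Overcoming this requires multiple layers (e.g. a hidden layer realizing XOR from AND/OR-like units).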