EE 561 Communication Theory, Spring 2007 (4/12/2007)
[Block diagram of a digital communication system: analog source → sample → quantize (or direct digital input) → source encoder → encryption → channel encoder → modulator → channel → demodulator → equalizer → channel decoder → decryption → source decoder → D/A conversion → analog output signal (or digital output).]
Channel Coding
Code Rate
A rate r = k/n block code maps k information bits into n code bits; the code rate is r = k/n < 1.
Channel Capacity
Reliable communication is possible only if the code rate is less than the channel capacity: R < C.
Capacity is measured in either:
- bits per channel symbol (code rate).
- bits per second (bps).
Capacity of Unconstrained Vector-Input AWGN Channel

$$ R < \frac{K}{2} \log_2\left( 1 + \frac{2 R E_b}{K N_0} \right) $$

Minimum Eb/No:

$$ \frac{E_b}{N_0} > \frac{2^{2R/K} - 1}{2R/K} $$
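The minimum-Eb/No expression above is easy to evaluate numerically. A small sketch (the function name is my own) that confirms two familiar limits: 0 dB for rate-1/2 coding with K = 1, and the ultimate Shannon limit ln 2, i.e. -1.59 dB, as 2R/K → 0:

```python
import math

def min_ebno_db(R, K):
    """Minimum Eb/No (in dB) for reliable communication at code rate R
    over a K-dimensional unconstrained-input AWGN channel, from
    Eb/No > (2^(2R/K) - 1) / (2R/K)."""
    x = 2.0 * R / K
    return 10.0 * math.log10((2.0**x - 1.0) / x)

# Rate-1/2 coding, one dimension per bit (2R/K = 1):
print(min_ebno_db(0.5, 1))       # 0.0 dB
# As 2R/K -> 0 the limit approaches ln(2), i.e. about -1.59 dB:
print(min_ebno_db(0.5, 1000))
```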
Constrained Capacity

$$ C = \log M - \frac{1}{M} \sum_{i=1}^{M} E\left[ \max^*_{s' \in S} \log f(r \mid s') - \log f(r \mid s_i) \right] $$

Where:

$$ f(r \mid s) = \exp\left( -\frac{\| r - s \|^2}{N_0} \right) $$

and max* denotes the Jacobian logarithm (log-sum-exp).

Comments:
- Can be found using Monte Carlo integration.
- Units = nats/symbol; change of base to get bits.
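The Monte Carlo evaluation described above can be sketched as follows for a real-valued constellation. The helper name and sampling details are my own, but the computation follows the formula: draw equiprobable symbols, add Gaussian noise, and average max* log f(r|s') − log f(r|s) using the log-sum-exp identity:

```python
import numpy as np

def constrained_capacity_bits(constellation, es_over_n0_db,
                              num_samples=200_000, seed=0):
    """Monte Carlo estimate of the modulation-constrained AWGN capacity
    in bits/symbol for a real-valued constellation (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    s = np.asarray(constellation, dtype=float)
    M = len(s)
    es = np.mean(s**2)
    n0 = es / 10**(es_over_n0_db / 10)
    sigma = np.sqrt(n0 / 2)             # noise variance N0/2 per dimension
    # Draw equiprobable symbols and add Gaussian noise.
    tx = rng.choice(s, size=num_samples)
    r = tx + sigma * rng.normal(size=num_samples)
    # log f(r|s') = -(r - s')^2 / N0 for every candidate symbol s'.
    log_f = -(r[:, None] - s[None, :])**2 / n0
    # max* over s' is the log-sum-exp (Jacobian logarithm).
    max_star = np.logaddexp.reduce(log_f, axis=1)
    log_f_tx = -(r - tx)**2 / n0
    # Capacity in nats/symbol, then change of base to bits.
    c_nats = np.log(M) - np.mean(max_star - log_f_tx)
    return c_nats / np.log(2)

# BPSK constrained capacity at Es/No = 0 dB:
print(round(constrained_capacity_bits([-1.0, 1.0], 0.0), 3))
```

At high SNR the estimate approaches log2 M bits/symbol; at low SNR it tracks the unconstrained Shannon capacity.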
[Figure: spectral efficiency (code rate r) vs. Eb/No in dB, showing the Shannon capacity bound and the BPSK capacity bound. The annotation marks the region in which it is theoretically possible to operate.]
Power Efficiency of Standard Binary Channel Codes
[Figure: spectral efficiency (code rate r) vs. Eb/No in dB at BER Pb = 10^-5, showing the Shannon bound, the BPSK capacity bound, uncoded BPSK, and standard codes: Mariner (1969), Pioneer (1968-72), Odenwalder convolutional codes (1976), Voyager (1977), IS-95 (1991), Galileo:BVD (1992), Turbo code (1993), Galileo:LGA (1996), Iridium (1998), and the LDPC code of Chung, Forney, Richardson, and Urbanke (2001).]
Constrained Capacity of Higher-Order Modulation
[Figures: constrained capacity (bits/symbol) vs. Es/No in dB, and vs. Eb/No in dB, for BPSK, QPSK, 8PSK, and M = 16, 32, 64 modulations.]
Example: find the product x1X2, with all arithmetic over GF(2) (modulo 2):

$$ \begin{bmatrix} 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 1 \end{bmatrix} $$
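The same product can be checked numerically; an integer matrix product followed by a modulo-2 reduction implements GF(2) arithmetic:

```python
import numpy as np

# Row vector times matrix with all arithmetic modulo 2 (GF(2)).
x = np.array([1, 0, 1])
A = np.array([[1, 1, 0],
              [0, 0, 1],
              [0, 1, 1]])
result = x @ A % 2
print(result)   # [1 0 1]
```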
Linear Codes
A code is linear if the modulo-2 sum of any two code words is itself a code word (so the all-zero word is always a code word).
Generator Matrices
A generator matrix G:
- has dimensionality k by n,
- spans the code space C,
- has rank k (linearly independent rows),
- has rows that form a basis for the code.
Example: a (7,3) code in systematic form:

$$ G = \begin{bmatrix} 1 & 0 & 0 & 1 & 1 & 1 & 0 \\ 0 & 1 & 0 & 0 & 1 & 1 & 1 \\ 0 & 0 & 1 & 1 & 1 & 0 & 1 \end{bmatrix}, \qquad H = \begin{bmatrix} 1 & 0 & 1 & 1 & 0 & 0 & 0 \\ 1 & 1 & 1 & 0 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 & 0 & 1 & 0 \\ 0 & 1 & 1 & 0 & 0 & 0 & 1 \end{bmatrix}, \qquad d_{\min} = 4 $$

input bits   output code word
x1 x2 x3     c1 c2 c3 c4 c5 c6 c7
0  0  0      0  0  0  0  0  0  0
0  0  1      0  0  1  1  1  0  1
0  1  0      0  1  0  0  1  1  1
0  1  1      0  1  1  1  0  1  0
1  0  0      1  0  0  1  1  1  0
1  0  1      1  0  1  0  0  1  1
1  1  0      1  1  0  1  0  0  1
1  1  1      1  1  1  0  1  0  0

(H = [Pᵀ I] follows from G = [I P]; every nonzero code word has weight 4, so d_min = 4.)
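A quick enumeration confirms the code word table and the minimum distance. The generator matrix below is the systematic (7,3) matrix as reconstructed here (an assumption where the slide is garbled):

```python
import itertools
import numpy as np

# Generator matrix of the (7,3) code in systematic form [I | P].
G = np.array([[1, 0, 0, 1, 1, 1, 0],
              [0, 1, 0, 0, 1, 1, 1],
              [0, 0, 1, 1, 1, 0, 1]])

# Every code word is c = x G (mod 2) for one of the 2^k = 8 input vectors.
codewords = [tuple(np.dot(x, G) % 2)
             for x in itertools.product([0, 1], repeat=3)]

# For a linear code, d_min = minimum weight of a nonzero code word.
d_min = min(sum(c) for c in codewords if any(c))
print(len(set(codewords)), d_min)   # 8 distinct code words, d_min = 4
```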
GHᵀ = 0, so every code word c = xG satisfies the parity-check equation cHᵀ = xGHᵀ = 0.
Let y = c + e, where e is an error vector. The syndrome yHᵀ = eHᵀ is zero only if e = 0 or e is itself a valid code word. Thus all error patterns can be detected except for error patterns that happen to be valid code words.
To locate a correctable error, compare the syndrome against that of a trial error in each bit position; repeat for all bit positions until the position of the error is determined.
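The bit-position search described above is syndrome decoding: each single-bit error pattern has a distinct syndrome, so a lookup table locates the error. A sketch for a (7,3) code, using a parity-check matrix H = [Pᵀ I] consistent with a systematic G = [I P] (an assumption for this example):

```python
import numpy as np

# Parity-check matrix H = [P^T | I] for the systematic (7,3) example code.
H = np.array([[1, 0, 1, 1, 0, 0, 0],
              [1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0, 1, 0],
              [0, 1, 1, 0, 0, 0, 1]])

def syndrome(y):
    """s = y H^T (mod 2); s = 0 iff y is a valid code word."""
    return tuple(np.dot(y, H.T) % 2)

# Table mapping the syndrome of each single-bit error to its position.
n = H.shape[1]
table = {}
for pos in range(n):
    e = np.zeros(n, dtype=int)
    e[pos] = 1
    table[syndrome(e)] = pos

# Correct a single error: transmit a code word, flip one bit, decode.
c = np.array([1, 0, 0, 1, 1, 1, 0])   # a code word (first row of G)
y = c.copy()
y[4] ^= 1                              # channel flips bit 4
y[table[syndrome(y)]] ^= 1             # look up and correct the error
print(np.array_equal(y, c))            # True
```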
Systematic Codes
A code is systematic if the k information bits appear unaltered in each code word (e.g., as the first k bits), so that G = [I P].
Hamming Distance
The Hamming weight of a code word is the number of its nonzero elements. The Hamming distance between two code words is the number of positions in which they differ:

$$ d_H(\mathbf{c}_1, \mathbf{c}_2) = \sum_{i=1}^{n} \left( c_{1,i} \oplus c_{2,i} \right) $$

Example:
c1 = (0,0,1,1,1,0,1)
c2 = (0,1,0,0,1,1,1)
d_H(c1, c2) = 4
Minimum Distance

$$ d_{\min} = \min_{i \ne j} d_H(\mathbf{c}_i, \mathbf{c}_j) $$

For a linear code, d_min equals the minimum Hamming weight over all nonzero code words.
Importance of Minimum Distance
In general, a code can detect up to d_min − 1 errors and correct up to

$$ t = \left\lfloor \frac{d_{\min} - 1}{2} \right\rfloor $$

errors.
- Repetition
- Hamming: Richard Hamming, Bell Labs, 1946
- Golay: Marcel Golay, 1949
- Reed-Muller: Reed & Muller, 1954
- CRC: Prange, 1957
- BCH: Bose, Ray-Chaudhuri, Hocquenghem, 1959-60
- Nonbinary codes (e.g., Reed-Solomon)
Repetition Codes
- k = 1
- n arbitrary (usually odd)
- r = 1/n
- d_min = n
- t = (n − 1)/2
Example (n = 7):
0 → 0000000
1 → 1111111
Hamming Codes
- n = 2^m − 1
- k = 2^m − 1 − m
- d_min = 3, so t = 1
- Shortened: d_min = 4
- Our example was a shortened (7,3) Hamming code.
Golay Codes
- n = 23, k = 12
- r = 12/23 ≈ 0.52
- d_min = 7, t = 3
- The Golay code is perfect: every received word lies within distance t = 3 of exactly one Golay code word.
- Therefore, when the number of code bit errors is greater than 3, the Golay code will always be incorrectly decoded.
- The only other known perfect codes are Hamming codes and odd-length repetition codes.
First-Order Reed-Muller Codes
- n = 2^m
- k = m + 1
- d_min = 2^(m−1)
- t = ⌊(2^(m−1) − 1)/2⌋
Example: m = 3 gives an (8,4) code with d_min = 4 and t = 1.
CRC Codes
Cyclic redundancy check (CRC) codes are used primarily for error detection.
BCH Codes
BCH: Bose-Chaudhuri-Hocquenghem
- n = 2^m − 1
- k = any value for which there exists some t such that n − k ≤ mt
- d_min = 2t + 1
Reed-Solomon Codes
- Nonbinary: symbols are drawn from an alphabet of size M = 2^m, i.e., each symbol represents m bits.
- m = 8 is common (a symbol is then a byte).
- n = 2^m − 1 symbols
- k = any value < n
- d_min = n − k + 1
- t = ⌊(n − k)/2⌋
- They have the largest possible d_min for any code with the same values of n and k.
Example: let m = 8 and t = 4. Then (n, k) = (255, 247) and M = 2^8 = 256.
Applications of RS Codes
Examples include compact disc (CD) storage and concatenated coding for deep-space links.
Hard-Decision Decoding
[Block diagram: k data bits → rate r = k/n block encoder → n code bits → modulator s(t) → AWGN channel adds n(t) → r(t) → MAP detector → estimates of code bits → block decoder → estimates of data bits. Making bit-by-bit decisions at the MAP detector before decoding is hard-decision decoding.]
Error Probability: Hard Decision Decoding
Assumptions: bounded-distance decoding, with each code bit flipped independently with probability p.

$$ P_c \le \sum_{i=t+1}^{n} \binom{n}{i} p^i (1-p)^{n-i} = 1 - \sum_{i=0}^{t} \binom{n}{i} p^i (1-p)^{n-i} $$

with equality for perfect codes (Golay, Hamming).
Example: Performance of (7,3) Code
Since d_min = 4, t = 1. With p = 0.01:

$$ P_c \le \sum_{i=2}^{7} \binom{7}{i} p^i (1-p)^{7-i} = 1 - \frac{7!}{0!\,7!}(0.99)^7 - \frac{7!}{1!\,6!}(0.01)(0.99)^6 $$
$$ = 1 - (0.99)^7 - 7(0.01)(0.99)^6 \approx 0.00203 $$
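The arithmetic above is easy to verify; the function name is my own:

```python
from math import comb

def block_error_bound(n, t, p):
    """Upper bound on code word error probability with hard-decision
    bounded-distance decoding: 1 - sum_{i=0}^{t} C(n,i) p^i (1-p)^(n-i)."""
    return 1.0 - sum(comb(n, i) * p**i * (1 - p)**(n - i)
                     for i in range(t + 1))

# (7,3) code with d_min = 4, so t = 1; crossover probability p = 0.01:
print(round(block_error_bound(7, 1, 0.01), 5))   # 0.00203
```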
For uncoded BPSK:

$$ P_b = Q\left( \sqrt{\frac{2E_b}{N_0}} \right) $$

For our coded system (r = 3/7), the code bit error probability is

$$ p = Q\left( \sqrt{\frac{2rE_b}{N_0}} \right) = Q\left( \sqrt{\frac{6E_b}{7N_0}} \right) $$

and so

$$ P_c \le 1 - \sum_{i=0}^{t} \binom{n}{i} p^i (1-p)^{n-i} = 1 - (1-p)^7 - 7p(1-p)^6 = 1 - (1-p)^6 (1+6p) $$
$$ = 1 - \left[ 1 - Q\left( \sqrt{\frac{6E_b}{7N_0}} \right) \right]^6 \left[ 1 + 6\,Q\left( \sqrt{\frac{6E_b}{7N_0}} \right) \right] $$
Example: (63,45) BCH code, t = 3:

$$ p = Q\left( \sqrt{\frac{2rE_b}{N_0}} \right) = Q\left( \sqrt{\frac{2(45)E_b}{63N_0}} \right) = Q\left( \sqrt{\frac{10E_b}{7N_0}} \right) $$

$$ P_c \le 1 - \sum_{i=0}^{t} \binom{n}{i} p^i (1-p)^{n-i} = 1 - \sum_{i=0}^{3} \binom{63}{i} p^i (1-p)^{63-i} $$
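This bound is straightforward to evaluate numerically; the Q function can be expressed through the complementary error function, Q(x) = erfc(x/√2)/2 (helper names are my own):

```python
from math import comb, erfc, sqrt

def Q(x):
    """Gaussian tail function Q(x) = erfc(x / sqrt(2)) / 2."""
    return 0.5 * erfc(x / sqrt(2.0))

def bch_63_45_pc(ebno_db):
    """Code word error bound for the (63,45) BCH code (t = 3) with
    hard decisions: p = Q(sqrt(2 r Eb/No)), r = 45/63."""
    ebno = 10**(ebno_db / 10.0)
    p = Q(sqrt(2 * (45 / 63) * ebno))
    return 1.0 - sum(comb(63, i) * p**i * (1 - p)**(63 - i)
                     for i in range(4))

# The bound decreases monotonically with Eb/No:
print(bch_63_45_pc(4.0), bch_63_45_pc(8.0))
```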
For the (23,12) Golay code:

$$ P_c = \sum_{i=4}^{23} \binom{23}{i} p^i (1-p)^{23-i} = 1 - \sum_{i=0}^{3} \binom{23}{i} p^i (1-p)^{23-i} $$

The bound is exact because the Golay code is perfect.

Where:

$$ p = Q\left( \sqrt{\frac{2\left(\frac{12}{23}\right)E_b}{N_0}} \right) $$
Let β(i) be the average number of bit errors when i code bits are in error. Then:

$$ P_b \approx \sum_{i=t+1}^{n} \frac{\beta(i)}{k} \binom{n}{i} p^i (1-p)^{n-i} \approx \frac{\beta(t+1)}{k}\, P_c $$
Weight distribution of the (23,12) Golay code: a_d is the number of code words of weight d, b_d is the total number of information-bit errors over all weight-d code words, and f(d) = b_d / a_d.

d     a_d    b_d    f(d)
7      253    924   3.65
8      506   2112   4.17
11    1288   7392   5.74
12    1288   8064   6.26
15     506   3960   7.83
16     253   2112   8.35
23       1     12   12

The most common error at high SNR is to a nearest code word (d = 7), so

$$ P_b \approx \frac{f(7)}{k}\, P_c = \frac{924/253}{12}\, P_c \approx 0.30\, P_c $$
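A quick sanity check on the table: the a_d must account for all 2^12 = 4096 code words (including the all-zero word), and f(7) = 924/253 ≈ 3.65:

```python
# Weight distribution of the (23,12) Golay code as (d, a_d, b_d) triples,
# where b_d is the total number of info-bit errors over weight-d code words.
table = [(7, 253, 924), (8, 506, 2112), (11, 1288, 7392),
         (12, 1288, 8064), (15, 506, 3960), (16, 253, 2112), (23, 1, 12)]

# All 2^12 code words are accounted for (plus the all-zero word):
total = 1 + sum(a for _, a, _ in table)
print(total)            # 4096

# Average info-bit errors per weight-7 code word, used in Pb ~ (f(7)/k) Pc:
f7 = 924 / 253
print(round(f7, 2))     # 3.65
```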
Coding Gain
Coding gain is the reduction in the Eb/No required to achieve a given error probability, relative to the uncoded system.
[Figure: BER vs. Eb/No in dB. At low BER, the (23,12) Golay code provides a 2.1 dB coding gain over uncoded BPSK. The capacity curve for this rate is at Eb/No = (2^{2R} − 1)/(2R) = 0.07 dB.]
[Block diagram: r(t) → correlator ∫₀^Tb r(t)φ₁(t)dt → hard limiter → ĉ → block decoder → x̂]

This is where the hard decision is made:

$$ \hat{c} = \begin{cases} 1 & \text{for } r > 0 \\ 0 & \text{for } r \le 0 \end{cases} $$

Information is lost! The hard limiter is essentially a 1-bit (2-level) quantizer.
[Block diagram: r(t) → correlator ∫₀^Tb r(t)φ₁(t)dt → p-bit quantizer → r_Q → block decoder → x̂. A p-bit quantizer discards less information than a hard limiter.]
Fully soft-decision decoding is equivalent to letting p → ∞:

[Block diagram: r(t) → correlator ∫₀^Tb r(t)φ₁(t)dt → block decoder (operating on the unquantized correlator output) → x̂]
Comments on Soft Decision Decoding
Soft-decision decoding performs about 2 dB better than hard-decision decoding at high Eb/No, at the cost of a more complex decoder.
Performance of Soft Decision Decoding
The pairwise error probability between code words c_i and c_j is

$$ P_2(\mathbf{c}_i, \mathbf{c}_j) = P(\hat{\mathbf{c}} = \mathbf{c}_j \mid \mathbf{c} = \mathbf{c}_i) = Q\left( \frac{d(\mathbf{c}_i, \mathbf{c}_j)}{\sqrt{2N_0}} \right) $$

where the Euclidean distance is

$$ d(\mathbf{c}_i, \mathbf{c}_j) = 2\sqrt{E_b\, r\, d_H(\mathbf{c}_i, \mathbf{c}_j)} $$
Union bound over all pairs of code words:

$$ P_c \le \frac{1}{2^k} \sum_{i=1}^{2^k} \sum_{\substack{j=1 \\ j \ne i}}^{2^k} P_2(\mathbf{c}_i, \mathbf{c}_j) = \frac{1}{2^k} \sum_{i=1}^{2^k} \sum_{\substack{j=1 \\ j \ne i}}^{2^k} Q\left( \sqrt{\frac{2E_b\, r\, d_H(\mathbf{c}_i, \mathbf{c}_j)}{N_0}} \right) $$

Assume a linear code:

$$ P_c \le \sum_{\substack{j \\ w_j \ne 0}} Q\left( \sqrt{\frac{2E_b\, r\, w_j}{N_0}} \right) = \sum_{d=d_{\min}}^{n} a_d\, Q\left( \sqrt{\frac{2E_b\, r\, d}{N_0}} \right) \approx a_{d_{\min}}\, Q\left( \sqrt{\frac{2E_b\, r\, d_{\min}}{N_0}} \right) $$

a_d is the number of code words of weight w = d. For high SNR, performance is dominated by the code words of weight w = d_min.
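The union bound is easy to evaluate from a weight distribution. A sketch (function name my own), applied to the (23,12) Golay code at 6 dB, where the d = 7 term indeed dominates:

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail function Q(x) = erfc(x / sqrt(2)) / 2."""
    return 0.5 * erfc(x / sqrt(2.0))

def union_bound_pc(weights, r, ebno_db):
    """Union bound on code word error probability with soft decisions:
    Pc <= sum_d a_d Q(sqrt(2 r d Eb/No)).
    `weights` is a list of (d, a_d) pairs for the nonzero weights."""
    ebno = 10**(ebno_db / 10.0)
    return sum(a * Q(sqrt(2 * r * d * ebno)) for d, a in weights)

# (23,12) Golay code weight distribution:
golay = [(7, 253), (8, 506), (11, 1288), (12, 1288),
         (15, 506), (16, 253), (23, 1)]
pc = union_bound_pc(golay, 12 / 23, 6.0)
# At high SNR the d = d_min = 7 term dominates the bound:
leading = 253 * Q(sqrt(2 * (12 / 23) * 7 * 10**0.6))
print(pc, leading)
```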
The bit error probability is bounded similarly, weighting each term by the average number of information-bit errors f(d):

$$ P_b \le \sum_{d=d_{\min}}^{n} \frac{a_d\, f(d)}{k}\, Q\left( \sqrt{\frac{2E_b\, r\, d}{N_0}} \right) \approx \frac{a_{d_{\min}}\, f(d_{\min})}{k}\, Q\left( \sqrt{\frac{2E_b\, r\, d_{\min}}{N_0}} \right) $$
Weight Distribution

d     a_d    b_d    f(d)
7      253    924   3.65
8      506   2112   4.17
11    1288   7392   5.74
12    1288   8064   6.26
15     506   3960   7.83
16     253   2112   8.35
23       1     12   12

(a_d = number of code words of weight d; b_d = a_d f(d) = total information-bit errors over those code words.)
For the (23,12) Golay code (r = 12/23):

$$ P_c \le \sum_{d} a_d\, Q\left( \sqrt{\frac{2E_b \left(\frac{12}{23}\right) d}{N_0}} \right) $$
$$ = 253\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(7)}{N_0}}\right) + 506\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(8)}{N_0}}\right) + 1288\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(11)}{N_0}}\right) + 1288\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(12)}{N_0}}\right) + 506\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(15)}{N_0}}\right) + 253\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(16)}{N_0}}\right) + Q\!\left(\sqrt{\tfrac{2E_b(12/23)(23)}{N_0}}\right) $$

At high SNR the d = 7 term dominates:

$$ P_c \approx 253\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(7)}{N_0}}\right) $$

For the bit error probability:

$$ P_b \le \frac{1}{12}\left[ 924\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(7)}{N_0}}\right) + 2112\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(8)}{N_0}}\right) + 7392\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(11)}{N_0}}\right) + 8064\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(12)}{N_0}}\right) + 3960\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(15)}{N_0}}\right) + 2112\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(16)}{N_0}}\right) + 12\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(23)}{N_0}}\right) \right] $$

At high SNR, with d = 7:

$$ P_b \approx 77\,Q\!\left(\sqrt{\tfrac{2E_b(12/23)(7)}{N_0}}\right) $$
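The full bit-error union bound for the Golay code can be evaluated numerically; note that the leading coefficient is 924/12 = 77, matching the high-SNR approximation (helper names are my own):

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail function Q(x) = erfc(x / sqrt(2)) / 2."""
    return 0.5 * erfc(x / sqrt(2.0))

# (d, b_d) pairs for the (23,12) Golay code, where b_d is the total number
# of information-bit errors over all code words of weight d.
golay_b = [(7, 924), (8, 2112), (11, 7392), (12, 8064),
           (15, 3960), (16, 2112), (23, 12)]

def golay_soft_pb(ebno_db, k=12, r=12/23):
    """Union bound on bit error probability with soft-decision decoding:
    Pb <= (1/k) sum_d b_d Q(sqrt(2 r d Eb/No))."""
    ebno = 10**(ebno_db / 10.0)
    return sum(b * Q(sqrt(2 * r * d * ebno)) for d, b in golay_b) / k

# Leading-term coefficient: 924/12 = 77, so Pb ~ 77 Q(sqrt(2 r 7 Eb/No)).
print(924 / 12)          # 77.0
print(golay_soft_pb(6.0))
```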
[Figure: BER vs. Eb/No in dB for the (23,12) Golay code: uncoded BPSK, hard-decision decoding, and soft-decision decoding (union bound and minimum-distance asymptote). Soft-decision decoding is about 2 dB better than hard-decision decoding at high Eb/No.]