
CONVOLUTIONAL CODE




Convolutional codes
• Channel coding with a convolutional code:
  • Encodes the data stream into a single codeword.
  • Does not require splitting the data stream into blocks of fixed size.
• The fundamental difference between block codes and convolutional codes lies in how they are designed and evaluated:
  • Block codes are based on algebraic/combinatorial techniques.
  • Convolutional codes are based on constructive techniques.

Convolutional codes - cont'd
• A convolutional code is specified by three parameters, written (n, k, K) or (k/n, K).
• Rc = k/n is the coding rate, i.e. the number of data bits per coded bit.
• In practice k = 1 is chosen.
• K is the constraint length of the encoder; the encoder has K - 1 memory elements (shift-register stages).
Block diagram of the DCS

[Block diagram: information source → convolutional encoder (rate 1/n) → modulator → channel → demodulator → convolutional decoder → information sink.]

Input sequence:      m = (m1, m2, ..., mi, ...)
Codeword sequence:   U = G(m) = (U1, U2, U3, ..., Ui, ...)
                     Ui = (u1i, ..., uji, ..., uni) is the branch word (n coded bits)
Received sequence:   Z = (Z1, Z2, Z3, ..., Zi, ...)
                     Zi = (z1i, ..., zji, ..., zni); the demodulator outputs n values per branch word
Decoded sequence:    m̂ = (m̂1, m̂2, ..., m̂i, ...)

Encoding process
• Initialise the shift-register memory before encoding by setting all shift registers to zero (the all-zero state).
• Append zero bits to the end of the data bits so that, after encoding, all shift-register contents return to zero. These appended zero bits are called tail bits (tail zeros).

  data | tail  →  Encoder  →  codeword

• Effective coding rate, where L is the number of data bits and k = 1 is assumed:

  R_eff = L / (n(L + K - 1)), which is smaller than Rc = k/n
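A quick numerical check of this formula, sketched in Python under the assumption of L = 3 data bits and the rate-1/2, K = 3 encoder used in the examples that follow:

L, n, K = 3, 2, 3
R_eff = L / (n * (L + K - 1))
print(R_eff)        # 0.3, i.e. 3/10, slightly below Rc = k/n = 1/2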

Convolutional encoder with rate ½
• Convolutional encoder (rate ½, K = 3)
• Two shift registers, where the first register holds the incoming data bit
• k = 1 and n = 2
• Example implementation with g1 = (1 1 1) and g2 = (1 0 1)

[Encoder diagram: the input data bits m feed the shift register; the modulo-2 adder defined by g1 = (1 1 1) produces the first coded bit u1 and the adder defined by g2 = (1 0 1) produces the second coded bit u2; together u1 u2 form the output branch word.]
Example: convolutional encoder with rate = 1/2

Input data: m = (1 0 1)

[Encoder diagrams for times t1 through t6: each diagram shows the register contents and the output branch word u1 u2 at that time step; two tail zeros flush the registers after the data bits.]

m = (1 0 1)   →   Encoder   →   U = (11 10 00 10 11)
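The same encoding can be checked with a minimal Python sketch (the function name conv_encode is illustrative, not from the slides); it implements the rate-1/2, K = 3 encoder above with g1 = (1 1 1), g2 = (1 0 1) and appends K - 1 tail zeros:

def conv_encode(data_bits):
    s1 = s2 = 0                        # shift-register contents, initially all zero
    out = []
    for bit in data_bits + [0, 0]:     # append K - 1 = 2 tail zeros to flush the registers
        u1 = bit ^ s1 ^ s2             # first coded bit, connections g1 = (1 1 1)
        u2 = bit ^ s2                  # second coded bit, connections g2 = (1 0 1)
        out += [u1, u2]
        s1, s2 = bit, s1               # shift the registers
    return out

print(conv_encode([1, 0, 1]))          # -> [1,1, 1,0, 0,0, 1,0, 1,1], i.e. U = 11 10 00 10 11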



Convolutional Encoder: Example
Example implementation with g1 = (1 0 1) and g2 = (1 1 1), rate-½ convolutional encoder.

[Sequence of encoder diagrams: the input 101 is shifted in one bit at a time; each diagram shows the register contents and the coded output bits u1 (from g1) and u2 (from g2) produced at that step.]

State diagram
• Describes the changes of the contents (state) of the shift-register memory.
• The next memory contents are determined by the incoming input bit and the current memory contents.
• There are 2^(K-1) states.
• The state diagram contains all states and all possible transitions between states (a small transition-table sketch follows below).
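The state diagram on the next slide can be tabulated with a short sketch; a minimal Python example, assuming the encoder above (g1 = (1 1 1), g2 = (1 0 1)) and writing the branch word in the (u1 u2) order of the earlier encoding example (the helper name step is illustrative):

def step(state, bit):
    s1, s2 = state                     # contents of the two memory elements
    u1 = bit ^ s1 ^ s2                 # adder defined by g1 = (1 1 1)
    u2 = bit ^ s2                      # adder defined by g2 = (1 0 1)
    return (bit, s1), (u1, u2)         # next state and branch word

states = [(0, 0), (1, 0), (0, 1), (1, 1)]      # a, b, c, d: 2^(K-1) = 4 states
for s in states:
    for bit in (0, 1):
        nxt, (u1, u2) = step(s, bit)
        print(f"state {s}, input {bit} -> next state {nxt}, output {u1}{u2}")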
State Diagram Representation of Convolutional Codes

[State diagram for the encoder with g1 = (1 1 1) and g2 = (1 0 1). The four states (SR0 SR1) are a = 00, b = 10, c = 01 and d = 11; each branch is labelled input/output (u2, u1), with one line style for input 0 and another for input 1.]
Trellis Representation of Convolutional Codes

[Trellis diagram for the same code: the states a (0 0), b (1 0), c (0 1) and d (1 1) are drawn at each time step, and every branch is labelled with its input/output pair (for example 0/00 from a to a and 1/11 from a to b); one line style marks input 0 and another marks input 1.]
Trellis Representation of Convolutional Codes - cont'd

[The same trellis, with the path corresponding to Input: 101, Output: 110100 highlighted.]

Encoder representation using vectors
• Vectors are defined to describe the convolutional encoder, one vector per modulo-2 adder.
• Each vector has K elements.
• The i-th element of a vector is "1" if the i-th register stage is connected to that modulo-2 adder, and "0" if it is not connected.
• Example (the encoder above):
  g1 = (1 1 1)  →  u1
  g2 = (1 0 1)  →  u2
Encoder representation using vectors - cont'd
• Representation in terms of the impulse response: the response of the encoder to a single "1" bit.
• Example:
  Input sequence:  1 0 0
  Output sequence: 11 10 11

  Register contents   Branch word u1 u2
  100                 1 1
  010                 1 0
  001                 1 1

  By linearity, the output for m = 1 0 1 is the modulo-2 superposition of shifted impulse responses:

  Input m   Output
  1         11 10 11
  0            00 00 00
  1               11 10 11
  Modulo-2 sum: 11 10 00 10 11
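The superposition above can be reproduced with a small Python sketch (an illustrative check, not taken from the slides): the coded output for m = 1 0 1 is the modulo-2 sum of copies of the impulse response 11 10 11, each shifted by n = 2 positions per input bit.

impulse = [1, 1, 1, 0, 1, 1]                 # response to a single "1": 11 10 11
m = [1, 0, 1]
n = 2                                        # coded bits per input bit
U = [0] * (n * (len(m) + 2))                 # 2 tail zeros -> 10 coded bits in total
for i, bit in enumerate(m):
    if bit:                                  # add the impulse response, shifted by n*i, modulo 2
        for j, u in enumerate(impulse):
            U[n * i + j] ^= u
print(" ".join(f"{U[k]}{U[k + 1]}" for k in range(0, len(U), 2)))   # 11 10 00 10 11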
Encoder representation using polynomials
• n generator polynomials are defined, one for each modulo-2 adder.
• Each polynomial has degree K - 1 or less.
• Each polynomial describes which shift-register stages are connected to the corresponding modulo-2 adder.
• Example:

  g1(X) = g0^(1) + g1^(1) X + g2^(1) X^2 = 1 + X + X^2
  g2(X) = g0^(2) + g1^(2) X + g2^(2) X^2 = 1 + X^2

  The encoder output becomes:

  U1(X) = m(X) g1(X)  and  U2(X) = m(X) g2(X)

Polynomial representation - cont'd

In more detail:

  m(X) g1(X) = (1 + X^2)(1 + X + X^2) = 1 + X + X^3 + X^4
  m(X) g2(X) = (1 + X^2)(1 + X^2)     = 1 + X^4

  m(X) g1(X) = 1 + X + 0·X^2 + X^3 + X^4
  m(X) g2(X) = 1 + 0·X + 0·X^2 + 0·X^3 + X^4

  U(X) = (1,1) + (1,0)X + (0,0)X^2 + (1,0)X^3 + (1,1)X^4
  U = 11 10 00 10 11
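The same result can be verified with modulo-2 polynomial multiplication; a short Python sketch (the helper name poly_mul_gf2 is illustrative), with coefficient lists ordered from the constant term upwards:

def poly_mul_gf2(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj            # multiply and accumulate modulo 2
    return out

m  = [1, 0, 1]                               # m(X)  = 1 + X^2
g1 = [1, 1, 1]                               # g1(X) = 1 + X + X^2
g2 = [1, 0, 1]                               # g2(X) = 1 + X^2

U1 = poly_mul_gf2(m, g1)                     # [1, 1, 0, 1, 1] = 1 + X + X^3 + X^4
U2 = poly_mul_gf2(m, g2)                     # [1, 0, 0, 0, 1] = 1 + X^4
print(" ".join(f"{a}{b}" for a, b in zip(U1, U2)))    # interleave: 11 10 00 10 11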

Optimum decoding
• If the input message sequences are equally likely, the optimum decoder, which minimizes the probability of error, is the maximum-likelihood (ML) decoder.
• The ML decoder selects the codeword U^(m), among all possible codewords, that maximizes the likelihood function p(Z | U^(m)), where Z is the received sequence and U^(m) is one of the possible codewords. There are 2^L codewords to search!
• ML decoding rule:
  Choose U^(m') if p(Z | U^(m')) = max over all U^(m) of p(Z | U^(m))

Soft and hard decisions
• In hard decision:
  • The demodulator makes a firm, or hard, decision on whether a one or a zero was transmitted, and provides the decoder with no other information, such as how reliable the decision is.
  • Hence its output is only zero or one (the output is quantized to only two levels); these are called "hard bits".
  • Decoding based on hard bits is called "hard-decision decoding".

Soft and hard decisions - cont'd
• In soft decision:
  • The demodulator provides the decoder with some side information together with the decision.
  • The side information gives the decoder a measure of confidence in the decision.
  • The demodulator outputs, called soft bits, are quantized to more than two levels.
  • Decoding based on soft bits is called "soft-decision decoding".
  • On AWGN channels about 2 dB, and on fading channels about 6 dB, of gain is obtained by using soft-decision instead of hard-decision decoding.

The Viterbi algorithm
• The Viterbi algorithm performs maximum-likelihood decoding.
• It finds a path through the trellis with the largest metric (maximum correlation or minimum distance).
• It processes the demodulator outputs in an iterative manner.
• At each step in the trellis, it compares the metrics of all paths entering each state and keeps only the path with the largest metric, called the survivor, together with its metric.
• It proceeds through the trellis by eliminating the least likely paths.
• It reduces the decoding complexity to L·2^(K-1)!

The Viterbi algorithm - cont'd
• Viterbi algorithm:
  A. Do the following set-up:
    • For a data block of L bits, form the trellis. The trellis has L + K - 1 sections or levels, starts at time t1 and ends at time t(L+K).
    • Label all the branches in the trellis with their corresponding branch metric.
    • For each state in the trellis at time ti, denoted S(ti) ∈ {0, 1, ..., 2^(K-1) - 1}, define a parameter Γ(S(ti), ti).
  B. Then, do the following:

The Viterbi(algorithm
0, t )  0 i  2. - cont’d1

ti
1. Set and
2. At time , compute the partial path metrics for all the paths entering each
S (ti ), ti 
state. ti
3. Set equal to the best partial path metric entering each state at
time .
Keep the survivor path and delete the dead paths from the trellis.
i  LK i
4. If , increase by 1 and return to step 2.
t L K
C. Start at state zero at time . Follow the surviving branches
backwards through the trellis. The path thus defined is unique and
correspond to the ML codeword.
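The steps above can be sketched in Python for the rate-1/2, K = 3 code used earlier (g1 = (1 1 1), g2 = (1 0 1)). This is a minimal hard-decision version that uses Hamming distance as the branch metric; the names branch_output and viterbi_decode are illustrative, not from the slides.

def branch_output(state, bit):
    s1, s2 = state                            # the two memory elements
    u1, u2 = bit ^ s1 ^ s2, bit ^ s2          # branch word for g1 = 111, g2 = 101
    return (bit, s1), (u1, u2)

def viterbi_decode(received, n_data_bits):
    states = [(a, b) for a in (0, 1) for b in (0, 1)]          # 2^(K-1) = 4 states
    INF = float("inf")
    metric = {s: (0 if s == (0, 0) else INF) for s in states}  # start in the all-zero state
    paths = {s: [] for s in states}
    for i in range(0, len(received), 2):
        z1, z2 = received[i], received[i + 1]                  # received branch word
        new_metric = {s: INF for s in states}
        new_paths = {}
        for s in states:
            if metric[s] == INF:
                continue
            for bit in (0, 1):
                nxt, (u1, u2) = branch_output(s, bit)
                d = (u1 != z1) + (u2 != z2)                    # branch metric: Hamming distance
                if metric[s] + d < new_metric[nxt]:            # keep only the survivor
                    new_metric[nxt] = metric[s] + d
                    new_paths[nxt] = paths[s] + [bit]
        metric, paths = new_metric, new_paths
    # the tail zeros force the encoder back to state (0, 0): trace back from there
    return paths[(0, 0)][:n_data_bits]

# Example: decoding the codeword 11 10 00 10 11 recovers the data bits 1 0 1
print(viterbi_decode([1,1, 1,0, 0,0, 1,0, 1,1], n_data_bits=3))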
Maximum Likelihood Decoding of Convolutional Codes
Maximum likelihood decoding:
  What is the transmitted sequence that most likely resulted in the received sequence at the decoder side?
Viterbi decoding of convolutional codes:
  A maximum-likelihood decoding algorithm: it finds the closest codeword to a given received sequence.
  Hard decision: closest = minimum Hamming distance
  Soft decision: closest = minimum voltage separation

Hard decision:
• The receiver makes a firm (hard) decision on whether a one or a zero was received.
• The receiver provides the decoder with no information characterizing the reliability of its decision.
• The input to the decoder is only zeros and ones.
Viterbi Decoder: Hard Decision

Assume the received (hard-decision) vector is 01 11 00 10 11 01.

[Sequence of trellis diagrams, one per decoding step: at each stage the partial path metric (accumulated Hamming distance) of every path entering each state a (0 0), b (1 0), c (0 1), d (1 1) is computed; only the survivor with the best metric into each state is kept, and the dead paths are deleted. When two paths entering a state have equal metrics, one of them is eliminated at random.]

Tracing the surviving branches back through the trellis, the decoded message is (0 1 1 0 1 0) and the decoded code vector is 01 01 00 10 11 00.
Practice problem: hard-decision Viterbi decoding

  m = (1 0 1 0 0)      U = (11 10 00 10 11)      Z = (11 10 11 10 01)

[Trellis diagram for t1 through t6 with states a (0 0), b (1 0), c (0 1), d (1 1) and branches labelled input/output, e.g. 0/00, 1/11, 0/11, 1/00, 0/10, 1/01, 0/01.]

Determine the transmitted data.
Did a data error occur?

Free distance of convolutional codes
• Distance properties:
  • Since a convolutional encoder generates codewords of various sizes (as opposed to block codes), the following approach is used to find the minimum distance between all pairs of codewords:
  • Since the code is linear, the minimum distance of the code is the minimum distance between each of the codewords and the all-zero codeword.
  • This is the minimum distance in the set of all arbitrarily long paths along the trellis that diverge from and remerge with the all-zero path.
  • It is called the minimum free distance, or the free distance of the code, denoted by d_free or d_f.

Free distance - cont'd

[Trellis diagram from t1 to t6: the all-zero path is drawn with the Hamming weight of each branch marked. The minimum-weight path that diverges from and remerges with the all-zero path has total weight d_f = 5.]
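The value d_f = 5 can be confirmed with a small search sketch in Python (illustrative, not from the slides): starting from the all-zero state, diverge with an input 1 and track the minimum accumulated branch weight until the path remerges with the all-zero state.

def step(state, bit):                          # same encoder: g1 = (1 1 1), g2 = (1 0 1)
    s1, s2 = state
    u1, u2 = bit ^ s1 ^ s2, bit ^ s2
    return (bit, s1), u1 + u2                  # next state, Hamming weight of the branch

def free_distance(max_steps=20):
    best = float("inf")
    start, w0 = step((0, 0), 1)                # diverge from the all-zero path
    frontier = {start: w0}                     # state -> minimum accumulated weight
    for _ in range(max_steps):
        new_frontier = {}
        for state, weight in frontier.items():
            for bit in (0, 1):
                nxt, w = step(state, bit)
                total = weight + w
                if nxt == (0, 0):
                    best = min(best, total)    # remerged with the all-zero path
                elif total < new_frontier.get(nxt, float("inf")):
                    new_frontier[nxt] = total
        frontier = new_frontier
    return best

print(free_distance())                         # -> 5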
Hard Decision vs Soft Decision Performance: Example, rate-1/3 Repetition Code

[Plot: coded bit error probability (logarithmic axis, 10^-7 to 10^-1) versus SNR = 10·log10(a^2/σ^2) in dB (0 to 9 dB), comparing soft-decision decoding with hard-decision decoding.]

Assignment, to be submitted!

Exercise 2
a) Find the maximum-likelihood path through the trellis diagram.
b) Determine the first 5 decoded data bits.