$$H = \frac{I_{\text{total}}}{N} = \sum_{i=1}^{M} P_i \log_2\!\left(\frac{1}{P_i}\right) \quad \text{bits/symbol}$$
This means that in a long message we can expect H bits of information
per symbol. H is called the entropy of the source.
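As a quick sketch of the formula (the source probabilities below are invented for illustration, not taken from the text):

```python
# Minimal sketch: entropy of a discrete memoryless source.
from math import log2

def entropy(probs):
    """H = sum_i P_i * log2(1/P_i), in bits/symbol."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # hypothetical source, M = 4
print(entropy(probs))               # 1.75 bits/symbol
```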
Information Rate
Information rate = total information / time taken.
Here, N symbols are transmitted at r symbols per second, so the
time taken is T_b = N/r. The total information is NH.
Information rate:
$$T_b = \frac{N}{r}, \qquad R = \frac{NH}{T_b} = \frac{NH}{N/r} = rH \quad \text{bits/sec}$$
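A minimal sketch of R = rH, assuming an illustrative symbol rate r and reusing the entropy value from the example above:

```python
# Sketch: information rate R = r * H (values are illustrative).
r = 1000                 # symbols per second (assumed)
H = 1.75                 # bits/symbol, from the entropy example above
R = r * H                # information rate
print(R)                 # 1750.0 bits/sec
```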
Some Maths
H satisfies the following inequality:
$$0 \le H \le \log_2 M$$
The maximum H occurs when all the messages have equal probability.
Hence H also measures the uncertainty about which symbol will occur: as
H approaches its maximum value, we cannot tell which message will
occur.
Consider a system that transmits only 2 messages, each with
probability of occurrence 0.5. Then H = 1, and at any instant we cannot
say which of the two messages will occur. So what happens when the
source has more than two symbols?
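A small check of the bound, using the two-message example above plus a hypothetical skewed source for contrast:

```python
# Sketch: verifying 0 <= H <= log2(M) for a binary source.
from math import log2

def entropy(probs):
    return sum(p * log2(1 / p) for p in probs if p > 0)

M = 2
equal  = [0.5, 0.5]              # equiprobable: H hits the upper bound
skewed = [0.9, 0.1]              # unequal: H falls below log2(M)
print(entropy(equal), log2(M))   # 1.0 1.0
print(entropy(skewed))           # ~0.469 -- less uncertainty
```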
Variation of H vs. p
Let us consider a binary source, i.e., M = 2.
Let the two symbols occur with probabilities p and 1 - p respectively,
where 0 < p < 1.
The entropy is then
$$H = p\,\log_2\!\left(\frac{1}{p}\right) + (1-p)\,\log_2\!\left(\frac{1}{1-p}\right) = \Omega(p)$$
Ω(p) is called the horseshoe function, after the shape of its curve.
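A short sketch tabulating Ω(p) at a few illustrative points:

```python
# Sketch: the horseshoe function Omega(p) over 0 <= p <= 1.
from math import log2

def omega(p):
    """Binary entropy: p*log2(1/p) + (1-p)*log2(1/(1-p))."""
    if p in (0.0, 1.0):
        return 0.0           # limit value as p -> 0 or p -> 1
    return p * log2(1 / p) + (1 - p) * log2(1 / (1 - p))

for p in [0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0]:
    print(f"p={p:.1f}  H={omega(p):.3f}")
# Peaks at H = 1 when p = 0.5; symmetric about p = 0.5.
```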
At the maximum, the first derivative vanishes and the second derivative is negative:
$$\frac{dH}{dp} = \frac{d\,\Omega(p)}{dp} = 0, \qquad \frac{d^2H}{dp^2} = -\frac{1}{\ln 2}\left(\frac{1}{p} + \frac{1}{1-p}\right) < 0$$
[Plot: H = Ω(p) vs. p; the curve rises from 0 at p = 0 to a peak of 1 at p = 0.5 and falls back to 0 at p = 1.]
Now we want to obtain the shape of the curve. Setting the first derivative to zero gives
$$\log_2\!\left(\frac{1-p}{p}\right) = 0 \;\Rightarrow\; \frac{1-p}{p} = 1 \;\Rightarrow\; p = \frac{1}{2},$$
at which H = 1. Verify by double differentiation that this stationary point is indeed a maximum.
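A numerical sanity check of both derivative conditions at p = 0.5, using finite differences (the step size h is arbitrary):

```python
# Sketch: confirming dH/dp = 0 and d2H/dp2 < 0 at p = 0.5.
from math import log2

def omega(p):
    return p * log2(1 / p) + (1 - p) * log2(1 / (1 - p))

h = 1e-5
p = 0.5
d1 = (omega(p + h) - omega(p - h)) / (2 * h)               # first derivative
d2 = (omega(p + h) - 2 * omega(p) + omega(p - h)) / h**2   # second derivative
print(d1)   # ~0      -> stationary point
print(d2)   # ~-5.77  -> negative (= -4/ln 2), so p = 0.5 is a maximum
```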
Example
Maximum Information Rate
We know that $R = rH$. Also,
$$H_{\max} = \log_2 M$$
Hence,
$$R_{\max} = r \log_2 M$$
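A one-line computation with assumed values of r and M (both illustrative):

```python
# Sketch: maximum information rate R_max = r * log2(M).
from math import log2

r = 1000                   # symbols/sec (assumed)
M = 4                      # alphabet size (assumed)
R_max = r * log2(M)        # achieved when all M symbols are equiprobable
print(R_max)               # 2000.0 bits/sec
```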
Coding for Discrete Memoryless Source
Here, discrete means the source emits symbols from a fixed set of
distinct symbols.
Memoryless means the occurrence of the present symbol is
independent of the previous symbols.
Average Code Length
$$\bar{N} = \sum_{i=1}^{M} p_i N_i$$
where $N_i$ = code length of the $i$-th symbol in binary
digits (binits).
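A sketch with hypothetical probabilities and codeword lengths (not from the text):

```python
# Sketch: average code length N_bar = sum_i p_i * N_i (in binits).
probs   = [0.5, 0.25, 0.125, 0.125]   # assumed symbol probabilities
lengths = [1, 2, 3, 3]                # assumed codeword lengths N_i
N_bar = sum(p * n for p, n in zip(probs, lengths))
print(N_bar)                          # 1.75 binits/symbol
```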
Efficiency:
$$\eta = \frac{R}{r_b} = \frac{H}{\bar{N}} \le 1$$
where $r_b = r\bar{N}$ is the rate in binits per second.
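Continuing the hypothetical code above, its efficiency works out to exactly 1 because each length $N_i$ equals $\log_2(1/p_i)$:

```python
# Sketch: code efficiency eta = H / N_bar for the code sketched above.
from math import log2

probs   = [0.5, 0.25, 0.125, 0.125]   # assumed, as before
lengths = [1, 2, 3, 3]                # assumed, as before
H     = sum(p * log2(1 / p) for p in probs)
N_bar = sum(p * n for p, n in zip(probs, lengths))
print(H / N_bar)   # 1.0 -- this code matches the entropy exactly
```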
Kraft's Inequality
$$K = \sum_{i=1}^{M} 2^{-N_i} \le 1$$
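A quick sketch: the hypothetical lengths used above satisfy the inequality with equality, while an overfull set of lengths violates it (so no prefix-free binary code with those lengths can exist):

```python
# Sketch: checking Kraft's inequality for sets of codeword lengths.
def kraft(lengths):
    """K = sum_i 2^(-N_i); a prefix-free binary code requires K <= 1."""
    return sum(2 ** -n for n in lengths)

print(kraft([1, 2, 3, 3]))    # 1.0   -> satisfies the inequality
print(kraft([1, 1, 2]))       # 1.25  -> violates it
```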