
Assignments (All Units)

Subject: DCDR (2161603)


* ḇ is to be treated as a blank space
# Assume /b denotes a blank space

1. Define the following. 14
1. Compression Ratio. 2. Vector Quantization. 3. Entropy. 4. LZ77. 5. Huffman Code.
6. HINT. 7. Data Retrieval. 8. LZW. 9. Query Optimization. 10. Distortion. 11. Self-
Information. 12. Compare Lossless and Lossy Data Compression. 13. Run Length
Coding. 14. Rice Code.

Unit 1: Compression Techniques


1. Compare Lossless Compression with Lossy Compression. 03
2. Explain types of data compression. List out applications of data retrieval. 07
3. How is the performance of multiple data compression algorithms measured?
Explain the parameters used to select one algorithm out of many. 07
4. Differentiate following:
(i). Lossy Compression vs. Lossless Compression
(ii). Statistical vs. Dictionary based compression. 04
5. Explain modeling and coding. Explain how they help to reduce the entropy of the
following data: 9,11,11,11,14,13,15,17,16,17,20,21. 07 // Explain modeling and
coding. Explain how they help to reduce entropy with a suitable example. 07 // How
are modeling and coding useful for the development of a data compression algorithm?
04
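The effect of modeling in Q5 can be checked numerically: predicting each sample from the previous one and coding the residuals lowers the first-order entropy of the sequence. A minimal Python sketch (illustrative only; entropy here is estimated from observed frequencies):

```python
from collections import Counter
from math import log2

def entropy(seq):
    """First-order entropy in bits/symbol, estimated from symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

data = [9, 11, 11, 11, 14, 13, 15, 17, 16, 17, 20, 21]
# Model: predict each sample as the previous one, then code the residuals.
residuals = [data[0]] + [b - a for a, b in zip(data, data[1:])]

print(f"raw entropy:      {entropy(data):.3f} bits/symbol")       # ~3.02
print(f"residual entropy: {entropy(residuals):.3f} bits/symbol")  # ~2.52
```

Fewer distinct, more skewed residual values give a lower entropy, which is exactly what a good model buys the coder.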

Unit 2: Mathematical Preliminaries for Lossless Compression Models


1. Explain different types of models in data compression. 04
2. Write a short note on uniquely decodable codes. 04
3. Define following terms: 03
(i) Uniquely Decodable Code (ii) Prefix Code (iii) Instantaneous Code
4. Write a short note on Prefix Code. 03
5. Define: - Compression Ratio, Entropy, Distortion, Data Retrieval, Query Optimization,
HINT and Run Length Coding. 07
6. Explain Markov model with example. 07
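For the uniquely decodable / prefix / instantaneous code questions above, checking prefix-freeness can be automated. A small Python sketch (note: prefix-freeness is sufficient but not necessary for unique decodability):

```python
def is_prefix_free(codewords):
    """True if no codeword is a proper prefix of another (instantaneous code)."""
    # If any codeword is a prefix of another, it is also a prefix of its
    # immediate successor in sorted order, so adjacent pairs suffice.
    words = sorted(codewords)
    return all(not longer.startswith(shorter)
               for shorter, longer in zip(words, words[1:]))

print(is_prefix_free(["0", "10", "110", "111"]))  # True: decodable on the fly
print(is_prefix_free(["0", "1", "10"]))           # False: "1" is a prefix of "10"
```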

Unit 3: Huffman coding


1. An alphabet S = {a1, a2, a3, a4, a5} has symbols with probabilities P(a1) = 0.4, P(a2) =
0.3, P(a3) = 0.2, P(a4) = 0.09, and P(a5) = 0.01. Find the Huffman code, source entropy,
average length, and compression ratio. 04
2. Explain Huffman Coding with respect to minimum variance Huffman codes with
separate trees. 07
3. Explain the Encoding process of Adaptive Huffman Algorithm. 07
4. How does Extended Huffman coding reduce the average code length? Prove this using
alphabet A = {a1, a2, a3} with probabilities 0.95, 0.03 and 0.02 respectively. 07
5. Consider a source containing 26 distinct symbols [A-Z]. Encode the given sequence of
symbols using the Adaptive Huffman algorithm. Symbol sequence: MUMMY. 07
6. Design a minimum variance Huffman code for a source that puts out letters from an
alphabet A = {a1, a2, a3, a4, a5, a6} with P(a1) = P(a2) = 0.2, P(a3) = 0.25, P(a4) = 0.05,
P(a5) = 0.15, P(a6) = 0.15.
Find the entropy of the source, the average length of the code, and the efficiency.
Also comment on the difference between a Huffman code and a minimum variance
Huffman code. 07
7. Explain Huffman Coding in detail with example. Define minimum variance Huffman
codes. 07 // Explain Huffman Coding with suitable example. 07
8. Encode “aacdeaab” using Adaptive Huffman code. Derive Output string, Codes and
final tree. 07
9. Encode “acadebaa” using Adaptive Huffman code. Derive, Codes and final tree. 07
10. Consider a source that emits letters from an alphabet A = {a1, a2, a3, a4} with probabilities
P(a1) = 0.3, P(a2) = 0.2, P(a3) = 0.35, P(a4) = 0.15.
[I] Find a Huffman code using minimum variance procedure.
[II] Find average length of the code. 07
11. Write a procedure to generate Adaptive Huffman Code. 03
12. Generate GOLOMB code for m=9 and n=8 to 13. 07
13. Generate GOLOMB code for m=5 and n=4 to 10. 04 // Generate GOLOMB code for m=5
and n=0 to 10. 04
14. Write a procedure to generate the TUNSTALL code. Generate the TUNSTALL code with
probabilities P(A)=0.6, P(B)=0.3, P(C)=0.1 and n=3 bits. 07 // Write a short note on
Tunstall Code. 04
15. Explain Tunstall Codes with example. 07
16. Generate TUNSTALL code P(A)=0.4, P(B)=0.3, P(C)=0.3 and n=3 bits. 04
17. Explain Rice Codes in brief. 03
18. Write different applications of Huffman Coding. 03 // Write different applications
of Huffman Coding. 07
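For cross-checking answers like Q1 above, a minimal Python sketch of Huffman code construction (an illustrative heap-based build; tie-breaking may differ from the minimum variance procedure, though codeword lengths are unaffected for this distribution):

```python
import heapq
from math import log2

def huffman_code(probs):
    """Build a Huffman code; probs maps symbol -> probability."""
    # Heap items: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

probs = {"a1": 0.4, "a2": 0.3, "a3": 0.2, "a4": 0.09, "a5": 0.01}
code = huffman_code(probs)
entropy = -sum(p * log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code)  # codeword lengths 1, 2, 3, 4, 4
print(f"entropy = {entropy:.4f} bits, average length = {avg_len:.2f} bits/symbol")
```

Against a 3-bit fixed-length code for five symbols, the average length of 2.0 bits gives a compression ratio of 3/2.0 = 1.5.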

Unit 4: Arithmetic Coding


1. Given a source with symbol probabilities P(A)=0.45, P(B)=0.25, P(C)=0.15,
P(D)=0.15, perform encoding of the string "BCADB" using arithmetic coding and
generate the tag. 04
2. Using the given probabilities P(A)=0.2, P(B)=0.2, P(C)=0.2, P(D)=0.4, decode tag
0.14496 for at least five symbols. 04
3. Define Arithmetic Coding. Encode and Decode “BACBA” with arithmetic coding.
(P(A)=0.5,P(B)=0.3,P(C)=0.2) 07
4. Encode and Decode “AABBC” with arithmetic coding. (P(A) = 0.6, P(B) = 0.3, P(C) =
0.1). 07
5. Encode the following sequence using Digram Coding of the static dictionary method
(generate for 3 bits): abracadabra. 07
6. Encode and decode “BACBA” with Arithmetic Coding. [ P(A) = 0.5, P(B) = 0.3, P(C)
= 0.2 ] 07
7. Write the method to generate a tag in arithmetic coding. 07
8. Write an encoding algorithm for arithmetic coding. 07
9. Compare Arithmetic Coding and Huffman Coding Algorithms for text compression. 03
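A small Python sketch of tag generation and decoding for the arithmetic coding exercises (the cumulative distribution is taken in the dictionary's insertion order, which is an assumption; match it to the ordering used in class):

```python
def arithmetic_tag(sequence, probs):
    """Narrow [low, high) symbol by symbol; return the interval midpoint as the tag."""
    cum, total = {}, 0.0
    for s, p in probs.items():
        cum[s] = total
        total += p
    low, high = 0.0, 1.0
    for s in sequence:
        span = high - low
        high = low + span * (cum[s] + probs[s])  # uses the old low on purpose
        low = low + span * cum[s]
    return (low + high) / 2, (low, high)

def arithmetic_decode(tag, probs, n):
    """Recover n symbols by locating the tag and rescaling it back to [0, 1)."""
    out = []
    for _ in range(n):
        total = 0.0
        for s, p in probs.items():
            if total <= tag < total + p:
                out.append(s)
                tag = (tag - total) / p
                break
            total += p
    return "".join(out)

probs = {"A": 0.5, "B": 0.3, "C": 0.2}
tag, interval = arithmetic_tag("BACBA", probs)
print(tag, interval)                     # tag 0.63725 in [0.635, 0.6395)
print(arithmetic_decode(tag, probs, 5))  # BACBA
```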

Unit 5: Dictionary Techniques


1. Given an initial dictionary Index 1=w, 2=a, 3=b, encode the following message using
the LZ78 algorithm: wabba/bwabba/bwabba/bwabba/bwoo/bwoo/bwoo. # 07
2. Explain the process of generating a triple in all three possible cases of the LZ77
algorithm. 03
3. Compare & contrast: 04
(i) LZ78 and LZW Algorithms.
(ii) Static Dictionary Based Algorithm vs. Dynamic Dictionary Based
Algorithm
4. Encode and Decode following sequence using LZW Coding technique. Sequence:
ABABABAB. 07
5. Explain LZ78 encoding procedure with suitable example. 07
6. Given an initial dictionary consisting of the letters a b r y ḇ, encode the following
message using the LZW algorithm: aḇbarḇarrayḇbyḇbarrayarḇbay. 07
7. Encode the following sequence using the LZ77 and LZ78 algorithm:
ḇarrayarḇbarḇbyḇbarrayarḇba. Assume you have a window size of 30 with a look-
ahead buffer of size 15. Furthermore, assume that C(a) = 1, C(b) = 2, C(ḇ) = 3, C(r) = 4,
and C(y) = 5.* 07
8. List out the different techniques for Lossless Compression and explain LZ77 with
example. 07
9. List out the different techniques for Lossless Compression and explain LZW with
example. 07
10. A sequence is encoded using the LZ77 algorithm. Given that C(a) = 1, C(b) = 2, C(r) =
3, and C(t)= 4, decode the following sequence of triples:
<0, 0, 3>,< 0, 0, 1>,<0, 0, 4>,< 2, 8, 2>,< 3, 1, 2>,<0, 0, 3>,<6, 4, 4>,<9,5, 4>.
Assume that the size of the window is 20 and the size of the look-ahead buffer is 10.
Encode the decoded sequence and make sure you get the same sequence of triples. 07
11. Explain LZ78 encoding procedure. 07
12. Explain LZW method with example. 07
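A compact Python sketch of LZW encoding and decoding, using the two-symbol dictionary from Q4 (dictionary indices start at 1 here, an assumed convention; adjust to the one used in class):

```python
def lzw_encode(text, initial):
    """Emit an index whenever current+next is new, then add it to the dictionary."""
    dictionary = {s: i for i, s in enumerate(initial, start=1)}
    out, current = [], ""
    for ch in text:
        if current + ch in dictionary:
            current += ch
        else:
            out.append(dictionary[current])
            dictionary[current + ch] = len(dictionary) + 1
            current = ch
    out.append(dictionary[current])
    return out

def lzw_decode(codes, initial):
    """Rebuild the dictionary on the fly; handles the 'index not yet defined' case."""
    dictionary = {i: s for i, s in enumerate(initial, start=1)}
    prev = dictionary[codes[0]]
    out = [prev]
    for code in codes[1:]:
        entry = dictionary.get(code, prev + prev[0])  # corner case: index just added
        dictionary[len(dictionary) + 1] = prev + entry[0]
        out.append(entry)
        prev = entry
    return "".join(out)

codes = lzw_encode("ABABABAB", ["A", "B"])
print(codes)                          # [1, 2, 3, 5, 2]
print(lzw_decode(codes, ["A", "B"]))  # ABABABAB
```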

Unit 6: Predictive Coding


1. Explain the significance of the discrete cosine transform (DCT) in JPEG Compression. 03
2. Draw and Explain Block diagram for Baseline JPEG Algorithm. 07
3. Explain JPEG. 07
4. What is the significance of Quantization and Zigzag Coding in JPEG Compression? 04
5. Write a short note on Old JPEG standard and JPEG-LS. 07
6. Explain CALIC. 03 // Explain CALIC. 07
7. Explain OLD JPEG Standard. 03
8. Explain prediction with partial match in short. 04 // Explain Prediction with Partial
Match in detail. 07
9. Encode the sequence this/bis/bthe using Burrows-Wheeler transform and move to front
coding. 07
10. Explain Burrows-Wheeler transform with example. 04
11. Encode the sequence etaḇcetaḇandḇbetaḇceta using Burrows-Wheeler transform and
move to front coding.* 07
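For the Burrows-Wheeler + move-to-front questions, a short Python sketch. Note the assumption: this version appends a '$' sentinel and sorts full rotations, whereas the textbook variant may instead transmit the index of the original row.

```python
def bwt(text, eos="$"):
    """Burrows-Wheeler transform: last column of the sorted cyclic rotations."""
    text += eos
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(row[-1] for row in rotations)

def move_to_front(text, alphabet):
    """Emit each symbol's current list position, then move it to the front."""
    table = list(alphabet)
    out = []
    for ch in text:
        i = table.index(ch)
        out.append(i)
        table.insert(0, table.pop(i))
    return out

transformed = bwt("banana")
print(transformed)  # annb$aa
print(move_to_front(transformed, sorted(set(transformed))))  # [1, 3, 0, 3, 3, 3, 0]
```

The transform groups equal symbols together, so move-to-front produces many small indices, which a Huffman or arithmetic coder then compresses well.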

Unit 7: Mathematical Preliminaries for Lossy Coding (Scalar Quantization)


1. List out different types of quantizers. Explain the quantization problem with an example. 07
2. Explain Sampling and Quantization of an Audio Signal 03
3. Compare Uniform Quantization with Non Uniform Quantization. 03
4. Find the storage size of a grayscale video clip of 20 seconds duration with 640x480
resolution @ 30 FPS. 03
5. Explain Scalar Quantization in detail. 07 // Explain Scalar Quantization in brief. 04
6. Explain adaptive quantization with its two approaches. 07
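A minimal midrise uniform quantizer in Python, with mean squared error as the distortion measure (the sample values and step size are made up for illustration):

```python
import math

def uniform_quantize(x, step):
    """Midrise uniform quantizer: map x to the midpoint of its bin of width step."""
    return (math.floor(x / step) + 0.5) * step

samples = [0.12, -0.40, 0.77, 0.33, -0.91]
step = 0.25
quantized = [uniform_quantize(x, step) for x in samples]
distortion = sum((x - q) ** 2 for x, q in zip(samples, quantized)) / len(samples)
print(quantized)  # [0.125, -0.375, 0.875, 0.375, -0.875]
print(f"MSE distortion = {distortion:.6f}")
```

Incidentally, Q4 above is direct arithmetic: assuming 8 bits per pixel and no compression, 640 × 480 × 30 × 20 = 184,320,000 bytes, roughly 176 MB.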

Unit 8: Vector Quantization


1. Explain Vector Quantization in detail. 07 // Explain Vector Quantization in brief. 04
2. Explain how Vector Quantization is better than Scalar Quantization with example. 07
3. Write a short note on tree structure vector quantizer. 07
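Vector quantization replaces each input vector with the nearest codebook vector; a toy Python sketch with a made-up 2-D codebook:

```python
def nearest_codeword(vector, codebook):
    """Index of the codebook vector at minimum squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(codebook)), key=lambda i: dist2(vector, codebook[i]))

codebook = [(0, 0), (0, 4), (4, 0), (4, 4)]  # 4 codewords -> 2 bits per 2-D vector
print(nearest_codeword((3, 1), codebook))    # 2, i.e. codeword (4, 0)
```

Coding two samples jointly at 2 bits is 1 bit/sample, which is the sense in which VQ can outperform scalar quantization at the same rate.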
Unit 9: Boolean retrieval


1. Explain Lemmatization and Stemming in detail. 07
2. Explain skip pointers and Phrase queries with example. 07 // Write a short note on Skip
Pointer with example. 04 // Write a short note on Phrase queries with example. 04 //
Explain Skip Pointers in brief. 03
3. Explain Tokenization. 03 // Write a short note on Tokenization. 04
4. Explain Biword Indexes and Positional Indexes in brief. 03
5. Explain Data Retrieval in brief. 03 // Explain Information Retrieval in detail. 04
6. Explain the algorithm of intersecting two postings lists in data retrieval. 07
7. Explain and compare Incident matrix and Inverted index with example. 07
8. Write a short note on: I) Tokenization II) Stop words Removal. 07
9. Write a short note on: I) Positional Index II) data-centric XML retrieval. 07
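The postings-intersection algorithm asked for in Q6 is a linear merge over two sorted docID lists; a direct Python sketch:

```python
def intersect(p1, p2):
    """Merge-style intersection of two sorted postings lists, O(len(p1) + len(p2))."""
    out = []
    i = j = 0
    while i < len(p1) and j < len(p2):
        if p1[i] == p2[j]:
            out.append(p1[i])
            i += 1
            j += 1
        elif p1[i] < p2[j]:
            i += 1
        else:
            j += 1
    return out

print(intersect([1, 2, 4, 11, 31, 45], [2, 31, 54, 101]))  # [2, 31]
```

Skip pointers (Q2) accelerate the advancing steps by jumping several postings at a time while the skip target is still below the other list's current docID.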

Unit 10: XML retrieval


1. Explain Vector Space model for XML Retrieval. 04
2. Explain challenges in XML information retrieval. 07 // Discuss different challenges in
XML Retrieval. 03

By Prof. Minal Chauhan
