
Leonid Yaroslavsky

THEORETICAL
FOUNDATIONS
of
DIGITAL IMAGING
USING MATLAB

CRC Press
Preface

Work in digital imaging and its numerous applications has become a profession for many tens of thousands of engineers and researchers. This book is intended as a textbook for studying the theoretical foundations of digital imaging needed to master this profession. It is based on the experience accumulated by the present author over many years of working in the field and of teaching various courses in digital image processing and digital holography at the Russian Academy of Sciences; the National Institutes of Health, Bethesda, Maryland, USA; the University of Erlangen-Nuremberg, Germany; the Tampere University of Technology (Tampere, Finland); Agilent Labs, Palo Alto, California, USA; the Autonomous University of Barcelona, Barcelona, Spain; the Institut Henri Poincaré, Paris, France; Gunma University (Kiryu, Japan); and Tel Aviv University.

The book is addressed to young people who have opted to pursue a scientific and research career in imaging science and engineering. The most outstanding minds of mankind, such as Galileo Galilei, René Descartes, Isaac Newton, James Clerk Maxwell and many other scientists and engineers, contributed to this branch of modern science and technology. At least 12 Nobel Prizes have been awarded for contributions directly associated with image science and imaging devices, and the majority of the others would not have been possible without one or another imaging method. You will be in good company, dear reader. Let this book help you become a master of digital imaging.


THEORETICAL FOUNDATIONS OF DIGITAL IMAGING USING MATLAB

TABLE OF CONTENTS

Chapter 1. INTRODUCTION.

1.1 IMAGING GOES DIGITAL


1.2 BRIEFLY ABOUT THE BOOK STRUCTURE

Chapter 2. MATHEMATICAL PRELIMINARIES

2.1 MATHEMATICAL MODELS IN IMAGING


2.1.1 Primary definitions
2.1.2 Linear signal space, basis functions and signal representation as expansion over a set of basis
functions
2.2 SIGNAL TRANSFORMATIONS
2.3 IMAGING SYSTEMS AND INTEGRAL TRANSFORMS
2.3.1 Direct imaging and convolution integral
2.3.2 Multi-resolution imaging: wavelet transforms
2.3.3 Imaging in transform domain and diffraction integrals
2.3.4 Properties of the integral Fourier transform
2.3.5 Transforms in sliding window (windowed transforms) and signal sub-band decomposition
2.3.6 Imaging from projections and Radon transform
2.4 STATISTICAL MODELS OF SIGNALS AND TRANSFORMATIONS
2.4.1 Principles of statistical treatment of signals and signal transformations and basic definitions
2.4.2 Models of signal random interferences
2.4.3 Speckle noise model
2.4.4 Quantifying signal processing quality
2.4.5 Basics of optimal statistical parameter estimation
APPENDIX
Derivation of Eq. 2.2.5
Derivation of Eq. 2.3.28
Derivations of Eqs. 2.3.47-2.3.50
REFERENCES
INDEX

Chapter 3. IMAGE DIGITIZATION

3.1 PRINCIPLES OF SIGNAL DIGITIZATION


3.2 SIGNAL DISCRETIZATION
3.2.1. Signal discretization as expansion over a set of basis functions.
3.2.2. Typical basis functions and classification
3.2.3. Optimality of bases: Karhunen-Loève and related transforms
3.3 IMAGE SAMPLING
3.3.1 The sampling theorem and signal sampling
3.3.2 Sampling artifacts: quantitative analysis
3.3.3 Sampling artifacts: qualitative analysis
3.4 ALTERNATIVE METHODS OF DISCRETIZATION IN IMAGING DEVICES
3.5 SIGNAL SCALAR QUANTIZATION
3.5.1. Optimal quantization: principles
3.5.2 Design of optimal quantizers
3.5.3 Quantization in digital holography
3.6 BASICS OF IMAGE DATA COMPRESSION
3.6.1 What is image data compression and why do we need it?
3.6.2 Signal rate distortion function, entropy and statistical encoding
3.6.3 Outline of image compression methods
EXERCISES
APPENDIX

Derivation of Eq. 3.2.31
Derivation of Eq. 3.2.44
Derivation of Eq. 3.2.45
Derivation of Eq. 3.3.11
Derivation of Eq. 3.3.31
Derivation of Eq. 3.3.38
Derivation of Eq. 3.5.24
Basics of statistical coding
REFERENCES
INDEX

Chapter 4. DISCRETE SIGNAL TRANSFORMATIONS


4.1. BASIC PRINCIPLES OF DISCRETE REPRESENTATION OF SIGNAL
TRANSFORMATIONS
4.2 DISCRETE REPRESENTATION OF THE CONVOLUTION INTEGRAL
4.2.1. Digital convolution
4.2.2. Treatment of signal borders in digital convolution
4.3. DISCRETE REPRESENTATION OF FOURIER INTEGRAL TRANSFORM
4.3.1. Discrete Fourier Transforms
4.3.2. 2D Discrete Fourier Transforms
4.3.3. Properties of Discrete Fourier Transforms
4.3.4. Discrete Cosine and Sine Transforms
4.3.5. Signal convolution in the DCT domain
4.3.6. DFTs and discrete frequency response of digital filter
4.4. DISCRETE REPRESENTATION OF FRESNEL INTEGRAL TRANSFORM
4.4.1. Canonical Discrete Fresnel Transform and its versions
4.4.2. Invertibility of Discrete Fresnel Transforms and frincd-function
4.4.3. Convolutional Discrete Fresnel and Angular Spectrum Propagation Transforms
4.4.4. Two-dimensional Discrete Fresnel Transforms
4.5. DISCRETE REPRESENTATION OF KIRCHHOFF INTEGRAL
4.6. HADAMARD, WALSH AND WAVELET TRANSFORMS
4.6.1. Binary transforms
4.6.2. Discrete Wavelet Transforms and multi-resolution analysis
4.7. DISCRETE SLIDING WINDOW TRANSFORMS AND "TIME-FREQUENCY" SIGNAL
REPRESENTATION
APPENDIX
Derivation of Eq. 4.2.11
Derivation of Eq. 4.3.1
Reasoning regarding Eq. 4.3.3
Derivation of Eqs. 4.3.8 and 4.3.9
The principle of Fast Fourier Transform Algorithm
Representation of Scaled DFT as convolution
Derivation of Eq. 4.3.24
Derivation of Eqs. 4.3.29 and 4.3.31
Derivation of Eq. 4.3.34.
Derivation of Eq. 4.3.36
Derivation of Eq. 4.3.39.
Derivation of Eqs. 4.3.41.
Derivation of Eqs. 4.3.43 and 4.3.45
Derivation of Eqs. 4.3.46
Derivation of Eq. 4.3.47
Derivation of Eq. 4.3.56
Rotated and Scaled DFTs as digital convolution
Derivation of Eq. 4.3.64
Derivation of Eq.4.3.69
Derivation of Eq. 4.3.75
Derivation of Eq. 4.3.89
Derivation of Eq. 4.4.95
Derivation of Eq. 4.4.23
Derivation of Eq. 4.7.4

EXERCISES
REFERENCES
INDEX

CHAPTER 5. DIGITAL IMAGE FORMATION AND COMPUTATIONAL IMAGING

5.1. IMAGE RECOVERY FROM SPARSE OR NON-UNIFORMLY SAMPLED DATA


5.1.1 Formulation of the task
5.1.2 Discrete Sampling Theorem
5.1.3 Algorithms for signal recovery from sparse sampled data
5.1.4 Analysis of transforms
5.1.5 Selection of transform for image band-limited approximation
5.1.6 Application examples
5.1.7 Discrete sampling theorem and compressive sensing

5.2. DIGITAL IMAGE FORMATION BY MEANS OF NUMERICAL RECONSTRUCTION OF


HOLOGRAMS
5.2.1 Introduction
5.2.2 Principles of hologram electronic recording
5.2.3 Numerical algorithms for hologram reconstruction
5.2.4 Hologram pre- and post-processing
5.2.5 Point spread functions of numerical reconstruction of holograms

5.3. COMPUTER GENERATED DISPLAY HOLOGRAPHY


5.3.1 3D imaging and computer generated holography
5.3.2 Recording computer-generated holograms on optical media
5.3.3 Optical Reconstruction of Computer-Generated Holograms

5.4. COMPUTATIONAL IMAGING USING OPTICS-LESS LAMBERTIAN SENSORS


5.4.1 Optics-less passive sensors: motivation.
5.4.2 Imaging as a parameter estimation task
5.4.3 Optics-less passive imaging sensors: possible designs, expected performance, advantages and disadvantages

APPENDIX
Derivation of Eq. 5.2.28
Derivation of Eq. 5.2.44
Derivation of Eq. 5.2.50
Derivation of Eq. 5.3.12
Derivation of Eq. 5.4.5
Derivation of Eq. 5.4.6

EXERCISES
REFERENCES
INDEX

CHAPTER 6. IMAGE RESAMPLING AND BUILDING CONTINUOUS IMAGE MODELS

6.1. PERFECT RESAMPLING FILTER

6.2. FAST ALGORITHMS FOR DISCRETE SINC INTERPOLATION AND THEIR APPLICATION
6.2.1 Signal sub-sampling (zooming in) by means of DFT or DCT spectra zero padding
6.2.2 DFT and DCT based signal fractional shift algorithms and their basic applications
6.2.3 Fast image rotation using the fractional shift algorithms
6.2.4 Image zooming and rotation using Scaled and Rotated DFTs

6.3. DISCRETE SINC-INTERPOLATION vs OTHER INTERPOLATION METHODS:


PERFORMANCE COMPARISON

6.4. NUMERICAL DIFFERENTIATION AND INTEGRATION
6.4.1 Perfect digital differentiation and integration
6.4.2 Traditional numerical differentiation and integration algorithms versus DFT/DCT based ones:
performance comparison

6.5. LOCAL (Elastic) IMAGE RESAMPLING: SLIDING WINDOW DISCRETE SINC


INTERPOLATION ALGORITHMS

6.6 IMAGE DATA RESAMPLING AND IMAGE RECONSTRUCTION FROM PROJECTIONS


6.6.1 Discrete Radon Transform: an algorithmic definition and filtered back projection method for image
reconstruction
6.6.2 Direct Fourier method of image reconstruction
6.6.3 Image reconstruction from fan-beam projections

APPENDIX
REFERENCES
INDEX

CHAPTER 7. IMAGE PARAMETER ESTIMATION.


CASE STUDY: LOCALIZATION OF OBJECTS IN IMAGES

7.1. LOCALIZATION OF TARGET OBJECTS IN THE PRESENCE OF ADDITIVE GAUSSIAN


NOISE
7.1.1 Optimal localization device for target localization in non-correlated Gaussian noise.
7.1.2 Performance of ML-optimal estimators: normal and anomalous localization errors
7.1.3 Target object localization in the presence of non-white (correlated) additive Gaussian noise
7.1.4 Localization accuracy for the SNR-optimal filter
7.1.5 Optimal localization in color and multi component images
7.1.6 Object localization in the presence of multiple non-overlapping non-target objects

7.2. TARGET LOCALIZATION IN CLUTTERED IMAGES


7.2.1. Formulation of the approach
7.2.2. SCR-optimal adaptive correlator
7.2.3. Local adaptive SCR-optimal correlators
7.2.4. Object localization in blurred images
7.2.5. Object localization and edge detection. Selection of reference objects for target tracking

APPENDIX
7A.1 Distribution density and variances of normal localization errors
7A.2 Evaluation of the probability of anomalous localization errors
Derivation of Eqs. 7.2.18, 7.2.19 and 7.2.21

REFERENCES
INDEX

CHAPTER 8. IMAGE PERFECTING

8.1. IMAGE PERFECTING AS A PROCESSING TASK



8.2. POSSIBLE APPROACHES TO IMAGE RESTORATION FROM BLUR AND NOISINESS

8.3. MMSE OPTIMAL LINEAR FILTERS FOR IMAGE RESTORATION.
8.3.1 Transform domain MSE-optimal scalar filters
8.3.2 Empirical Wiener filters for image denoising
8.3.3 Empirical Wiener filters for image deblurring

8.4. SLIDING WINDOW TRANSFORM DOMAIN ADAPTIVE SIGNAL RESTORATION
8.4.1 Local adaptive filtering
8.4.2 Sliding window DCT transform domain filtering

8.4.3 Hybrid DCT/wavelet filtering

8.5. MULTI-COMPONENT IMAGE RESTORATION AND DATA FUSION

8.6. FILTERING IMPULSE NOISE


8.7. CORRECTING IMAGE GRAY SCALE NONLINEAR DISTORTIONS

8.8. NONLINEAR FILTERS FOR IMAGE PERFECTING


8.8.1. Nonlinear filter classification principles
8.8.2. Filter classification tables and particular examples
8.8.3. Image display options for image enhancement

APPENDIX
Derivation of Eq. 8.3.6
Empirical estimation of variance of additive signal-independent broad band noise in images.
Derivation of equation 8.5.4
Verification of Eq. 8.5.19

EXERCISES
REFERENCES
INDEX

CHAPTER 1.

INTRODUCTION.

1.1 IMAGING GOES DIGITAL

The history of science is, to a considerable degree, the history of the invention, development and perfecting of imaging methods and devices. The evolution of imaging systems can be traced back thousands of years, to rock engravings and ancient mosaics (Figure 1.1).

Figure 1.1. Detail from the mosaic floor of the Petra Church

Apparently, the very first imaging devices were human painters. Great artists such as Leonardo da Vinci, Michelangelo, Albrecht Dürer and many others not only created outstanding masterpieces of art, but actually pioneered imaging science and engineering (Figure 1.2).


Figure 1.2. Drawing by Leonardo da Vinci illustrating treatment of light and shade (a) and a woodcut by Albrecht Dürer showing an artist using Dürer's drawing machine to paint a lute (b)

The first artificial imaging device was, seemingly, the camera obscura (pinhole camera; Latin for "dark room"), which dates back to Aristotle (384-322 BCE) and Euclid (365-265 BCE). Then, in the XIII century, methods for polishing lenses were invented, and eyeglasses became widespread in Europe by the middle of the XV century. In the first years of the XVII century a decisive breakthrough happened when, in October 1609, Galileo Galilei (1564-1642), apparently using his knowledge of the laws of light refraction, greatly improved the magnification of the three-powered spyglasses unveiled not long before in the Netherlands and built a twenty-powered instrument. He directed it to the sky and almost immediately discovered mountains on the Moon, the satellites of Jupiter, the rings of Saturn, and the phases of Venus, and resolved nebular patches into stars. It was the beginning of the Scientific Revolution of the seventeenth century. Since that time, the pace of evolution of imaging science and imaging devices has been measured in decades rather than in centuries.

In the 1630s, René Descartes published the Dioptrique (the Optics), which summarized contemporary knowledge on such topics as the law of refraction, vision, and the rainbow.

In the late 1660s, Isaac Newton discovered that white light is composed of a spectrum of colors and built his first reflecting telescope, which avoided chromatic aberrations and enabled building telescopes with much greater magnification than was possible with refractive Galilean telescopes.

In the 1670s, Robert Hooke and Antonie van Leeuwenhoek introduced the microscope, invented new methods for grinding and polishing microscope lenses, and discovered cells and bacteria.

The next decisive breakthrough happened in the 1830-40s, when photography was invented, which for the first time solved the problem of converting images into pictures that can be stored, copied and sent by mail. This invention had a very profound influence on the development of our civilization, from people's everyday life to science and art. Photography became a branch of industry and a profession that served people's need to preserve images. After the invention of photography, painting became a pure art, which stimulated the birth of new art trends such as impressionism, while photographic plates became a major tool of experimental science, which eventually led to almost all outstanding discoveries made in the XIX and XX centuries.

In the 1890s, photographic plates enabled the discoveries of X-rays by Wilhelm Conrad Roentgen (Nobel Prize Laureate, 1901) and of radioactivity by Antoine Henri Becquerel (Nobel Prize Laureate, 1903). These discoveries, in their turn, almost immediately gave birth to new imaging techniques: X-ray imaging and radiography.

The 1890s were remarkable years in the history of science. Apart from the discoveries of X-rays and radioactivity, these were also the years of the discovery of radio, the major breakthrough in communication and information transmission, and of the invention of motion pictures, which solved the problem of imaging moving objects. With the invention of motion pictures, the principal new concept of time-sampled images was introduced and realized in imaging.

X-rays were discovered by W.C. Roentgen in experiments with cathode rays. Cathode rays had been discovered in 1859 by the German mathematician and physicist Julius Plücker, who used vacuum tubes invented in 1855 by the German inventor Heinrich Geissler. These tubes led eventually to the cathode ray tube (CRT), invented by Karl Braun in 1897, and finally brought about the development of electronic television and electron microscopy.

In 1907, the Russian professor Boris Rosing used a CRT in the receiver of a television system that, at the camera end, made use of mirror-drum scanning. Rosing transmitted crude geometrical patterns onto the television screen and was the first inventor to do so using a CRT. Then, in the 1920-40s, Vladimir Zworykin, a former student of Rosing, working first for Westinghouse and later for RCA in the US, and Philo Farnsworth, working independently in San Francisco, brought about the birth of purely electronic television. Electronic television finally won the thirty-year competition with electro-mechanical television, which was based on transmitting images using the image-scanning disk invented in 1883 by the German student Paul Nipkow. This engineering solution turned out to be a dead end in the evolution of television, although the idea itself of converting images into time signals by means of row-wise image scanning was of principal value and was eventually implemented much more efficiently as image scanning by an easily controlled electron beam. There is no need to tell the present generation what role television plays in our civilization.

The victory of electronic television heralded the birth of electronic imaging. One more outstanding representative of electronic imaging was the electron microscope. The first electron microscope was designed in 1931 by E. Ruska. In 1986, Ruska was awarded the Nobel Prize in Physics "for his fundamental work in electron optics, and for the design of the first electron microscope". Electron microscopes use electrons rather than light to view objects, and because of this they make it possible to view objects with a resolution far beyond that of optical microscopes.

Since the 1940s, the pace of evolution of imaging devices has been measured in years. In 1947, the British (Hungarian-born) scientist Dennis Gabor, while working to improve the resolution of electron microscopes, invented an optical method for recording and reconstructing the amplitudes and phases of coherent light radiation. He coined for this method the name holography, meaning a method for recording and reproducing the whole information carried by optical signals. For this invention he was awarded the Nobel Prize in Physics in 1971.

In the 1950s, synthetic aperture and side-looking radars were invented, which opened the way to creating images of objects in the radio-frequency band of electromagnetic radiation. The principle of synthetic aperture radars is actually the same principle of recording the amplitude and phase of radiation wave fronts as that of holography, and in 1962 Emmett Leith and Juris Upatnieks of the University of Michigan, US, recognized from their work in side-looking radar that holography could be used as a 3-D visual medium. They read Gabor's paper and "simply out of curiosity" decided to duplicate Gabor's technique using the recently invented laser and the "off-axis" technique borrowed from their work in the development of side-looking radar. The result was the first laser transmission hologram of 3-D objects. These transmission holograms produced images with clarity and realistic depth, but required laser light to view the holographic image. Also in 1962, on the other side of the globe, Yuri N. Denisyuk of the Russian Academy of Sciences combined holography with 1908 Nobel Laureate Gabriel Lippmann's work in natural color photography. Denisyuk's method produced reflection holograms which, for the first time, did not require coherent laser light for image reconstruction and could be viewed in the light of an ordinary incandescent bulb. Holography was the first example of what we can call transform domain imaging.

In 1969, at Bell Labs, US, George Smith and Willard Boyle invented the first CCDs, or Charge Coupled Devices. A CCD is a semiconductor electronic memory that can be charged by light. CCDs can hold a charge corresponding to variable shades of light, which makes them useful as imaging devices for cameras, scanners, and fax machines. Because of its superior sensitivity, the CCD has revolutionized the field of electronic imaging. In 2009, George Smith and Willard Boyle were awarded the Nobel Prize in Physics for their work on the CCD.

In 1971-72, the British engineer Godfrey Hounsfield of EMI Laboratories, England, and the South African-born physicist Allan Cormack of Tufts University, Massachusetts, US, working independently, invented a new imaging method that allowed building images of body slices from sets of their X-ray projections taken at different angles. Reconstruction of images from their projections required special computer processing of the projections, and the method became known as computer-assisted tomography. Computed tomography revolutionized medical imaging, and within a few years tens, and later hundreds and thousands, of CT scanners were installed in hospitals all over the world. In 1979, Hounsfield and Cormack were awarded the Nobel Prize in Physiology or Medicine.
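
As a foretaste of the computational side of this story, the principle can be sketched in a few lines of MATLAB. The fragment below is only an illustration, not one of the book's demo programs; it assumes that the Image Processing Toolbox functions phantom, radon and iradon are available, simulates parallel-beam X-ray projections of a synthetic slice, and reconstructs the slice by filtered back projection, the method treated later in Section 6.6.1.

% Illustrative sketch: reconstruction of a slice from its projections
% by filtered back projection (requires the Image Processing Toolbox).
slice  = phantom('Modified Shepp-Logan', 256);  % synthetic "body slice"
angles = 0:1:179;                               % projection angles in degrees
sino   = radon(slice, angles);                  % simulated X-ray projections (sinogram)
recon  = iradon(sino, angles, 'linear', 'Ram-Lak', 1, 256);  % filtered back projection
figure;
subplot(1, 3, 1); imshow(slice, []); title('Original slice');
subplot(1, 3, 2); imshow(sino,  []); title('Projections');
subplot(1, 3, 3); imshow(recon, []); title('Reconstruction');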

In the early 1980s, Gerd Binnig and Heinrich Rohrer of the IBM Research Division, Zurich Research Laboratory, Switzerland, invented the scanning tunneling microscope, which gives three-dimensional images of objects down to the atomic level. For this invention, Binnig and Rohrer were awarded the Nobel Prize in Physics in 1986. The scanning tunneling microscope remains one of the most powerful microscopes to date. With this invention, imaging techniques, which until then had been based on electromagnetic radiation, entered the quantum world of atomic forces.

But this is not the end of the story of the evolution of imaging. The 1970s-1980s and the years that followed were the years when imaging began to change rapidly from completely analog to digital. The first swallow in this process was computed tomography, in which images of body slices are computed from projection data, though its germ can be found in the methods of crystallography that emerged from the discovery of X-ray diffraction by Max von Laue (Nobel Prize Laureate, 1914) at the beginning of the XX century. Although von Laue's motivation was not to create a new imaging technique but rather to prove the wave nature of X-rays, photographic records of X-ray diffraction on crystals, or Lauegrams, very soon became the main imaging tool in crystallography, because from Lauegrams one can numerically reconstruct the spatial structure of atoms in crystals.

Digital imaging was born at the junction of imaging and communication. The first reports on digital images date back to the 1920s, when images for newspapers were sent by telegraph over a submarine cable between London and New York. Image transmission has always been a big challenge for communication because it requires communication channels of very large capacity. In the 1950s, the need to transmit television over long distances demanded compressing TV signals as much as possible. By that time, the first digital computers had become available, at least to large companies and research institutions, and researchers started using them for investigations in image compression. For these purposes, the first image input-output devices for computers were developed in the late 1950s and early 1960s. Satellite communication and space research, which had been skyrocketing since the first Sputnik in 1957, greatly stimulated work in digital image processing.

In 1964-71, computers were used at the Jet Propulsion Laboratory (Pasadena, California, US) for improving the quality of the first images of the Moon's surface transmitted, in digital form, from the US spacecraft Ranger 7, and of images of Mars transmitted from the US spacecraft Mariner 4 (1964), Mariner 7 (1969) and Mariner 9 (1971). In 1973-1976, the first color images of the Martian surface and the first panoramas from the surface of Venus were published by the USSR Academy of Sciences; they were obtained using digital image processing of data transmitted from the spacecraft Mars-4, Mars-5, Venera-9 and Venera-10, launched in the former USSR. By the late 1970s, digital image processing had become the main tool for processing satellite images for space exploration and remote sensing.

The availability of digital computers by the 1960s and the new opportunities they offered for information processing could not bypass yet another fast-growing child of the 1960s, holography. In 1966-68, the German professor Adolf Lohmann, while working in San Diego, California, US, invented computer-generated holograms, and in the late 1960s and early 1970s the first experiments with numerical reconstruction of optical holograms were reported. This was the birth of digital holography.

By the mid-1970s, the first minicomputer-based image processing workstations and high-quality grayscale and color computer displays had been created. It would be no exaggeration to say that the needs of processing, storing and displaying images were one of the main, if not the major, impetuses in the development of personal computers, which emerged in the early 1980s and became the mainstream of the computer industry. In the late 1980s and early 1990s, an industry of dedicated image processing boards for mini- and personal computers emerged, and since that time no PC has been sold without a video board.

The digital revolution affected the industry of image capturing devices as well. In 1972, Texas Instruments was the first to patent a film-less electronic camera. In August 1981, Sony released the first commercial electronic camera. Since the mid-1970s, Kodak has invented several solid-state image sensors that converted light into digital pictures for professional and home consumer use. In 1991, Kodak released the first professional digital camera system (DCS) with a 1.3-megapixel sensor. This became possible not only because megapixel CCDs had been developed by that time, but also because image compression methods, the foundations of which were laid in the 1950s-60s, had reached a stage at which their implementation became feasible thanks to the emergence of appropriate computational hardware. In 1992, the international Joint Photographic Experts Group issued the first JPEG standard for compression of still images, and in 1993 the Moving Picture Experts Group issued the first MPEG standard for compression of video. Nowadays, digital photographic and video cameras and the JPEG and MPEG standards are used overwhelmingly in the whole range of image-related activities, from space telescopes to mobile phones.

By the beginning of the XXI century, the era of photography and analog electronic imaging had given way to the era of digital imaging. Images are now generated, stored, transmitted and processed in digital form. Digital imaging has won in the evolution of imaging devices because it is much cheaper and more versatile than analog imaging and is ideally suited for integration with other information systems. The present author was lucky to participate in and contribute to the emergence of digital imaging from its very beginning in the early 1960s.

1.2 BRIEFLY ABOUT THE BOOK CONCEPT AND STRUCTURE

The father of information theory and, more generally, of modern communication theory, Claude Shannon, and his co-author H.W. Bode wrote in their paper "A Simplified Derivation of Linear Least Square Smoothing and Prediction Theory" ([1]):

In a classical report written for the National Defense Research Council ([2]), Wiener has developed a mathematical theory of smoothing and prediction of considerable importance in communication theory. A similar theory was independently developed by Kolmogoroff ([3]) at about the same time. Unfortunately the work of Kolmogoroff and Wiener involves some rather formidable mathematics (Wiener's yellow-bound report soon came to be known among bewildered engineers as "The Yellow Peril"), and this has prevented the wide circulation and use that the theory deserves. In this paper the chief results of the smoothing theory will be developed by a new method which, while not as rigorous or general as the methods of Wiener and Kolmogoroff, has the advantage of greater simplicity, particularly for readers with a background in electrical circuit theory. The mathematical steps in the present derivation have, for the most part, a direct physical interpretation, which enables one to see intuitively what the mathematics is doing.

This approach was the present author's guideline in writing this book. The concept of the book is to show that digital imaging and image processing is a science and not the "mathematical alchemy" it frequently appears to be, and to present the theoretical foundations of digital imaging as comprehensively as possible while avoiding, as much as possible, unnecessarily formidable mathematics.

Including this first introductory chapter, the book consists of 8 chapters. The necessary mathematical preliminaries are provided in a condensed form in the second chapter. Chapter 3 opens the basic contents of the book. It addresses the very first problem of digital imaging: the problem of converting images into digital signals that can be stored, transmitted and processed in digital computers. Chapter 4 is devoted to the problem of adequate representation of image transformations in computers. The main emphasis is on keeping the correspondence between the originally analog nature of image transformations and their computer implementations. In Chapter 5, the concept of computational imaging is illustrated by several instructive examples: image reconstruction from sparse or non-uniformly sampled data, digital image formation by means of numerical reconstruction of holograms, formation of virtual images by means of computer-generated display holograms, and computational image formation from sensor data obtained without imaging optics. Chapter 6 introduces methods for perfect image resampling and for building, in computers, continuous image models. Chapter 7 treats the fundamental problem of estimating, from image data, numerical parameters of objects present in the imaged scene; as a typical and representative example, the task of optimal localization of targets in images is discussed. Chapter 8 discusses methods of image perfecting and enhancement using linear and nonlinear filtering. All chapters are supplied with short introductory synopses.

In order to illustrate major results and facilitate their deeper understanding, the book offers, in each chapter, a number of exercises supported by demo programs in MATLAB. Those results and algorithms that are not supported by a dedicated demo program are formulated in a way that enables their easy implementation in MATLAB as well.

The book is self-contained and does not overload readers with the need to refer to other sources. As a rule, only references to classical milestone publications are given, to give due credit to their authors and to acquaint readers with the names of the giants on whose shoulders they stand. For further Google searches for additional information the reader might want, keywords and an index are provided in the text. In order to facilitate reading and understanding of the book, all formula derivations that require more than 2-3 lines of equations are moved to appendices, where they are presented in full detail without skipping intermediate stages, and, as C. Shannon recommended, derivations have, for the most part, a direct physical motivation and interpretation.

References

1. H.W. Bode, C.E. Shannon, A Simplified Derivation of Linear Least Square Smoothing and Prediction Theory, Proceedings of the I.R.E., vol. 38, no. 4, pp. 417-425, 1950.

2. N. Wiener, The Interpolation, Extrapolation, and Smoothing of Stationary Time Series, National Defense Research Committee report; reprinted as a book, J. Wiley and Sons, Inc., New York, N.Y., 1949.

3. A. Kolmogoroff, Interpolation und Extrapolation von stationären zufälligen Folgen, Bull. Acad. Sci. (USSR), Sér. Math., vol. 5, pp. 3-14, 1941.

Leonid P. Yaroslavsky, Fellow of the Optical Society of America, MS (1961, Faculty of Radio Engineering, Kharkov Polytechnic Institute, Kharkov, Ukraine), Ph.D. (1968, Institute for Information Transmission Problems, Moscow, Russia), Dr. Sc. Habilitatus in Physics and Mathematics (1982, State Optical Institute, St. Petersburg, Russia). From 1963 till 1983 he headed a group of Digital Image Processing and Digital Holography at the Institute for Information Transmission Problems, Russian Academy of Sciences, which, in particular, carried out digital processing of images transmitted from the spacecraft Mars-4, Mars-5, Venera-9 and Venera-10 and obtained the first color images of the surface of Mars and the first panoramas from the surface of Venus. From 1983 till 1995, he headed the Laboratory of Digital Optics at the Institute. From 1995 till 2008, he was a Professor at the Faculty of Engineering, Tel Aviv University, where, at present, he is a Professor Emeritus. He was also a visiting Professor at the University of Erlangen, Germany; the National Institutes of Health, Bethesda, MD, USA; the Institute of Optics, Orsay, France; the Institut Henri Poincaré, Paris, France; the International Center for Signal Processing, Tampere University of Technology, Tampere, Finland; Agilent Laboratories, Palo Alto, CA, USA; Gunma University, Kiryu, Japan; and the Autonomous University of Barcelona. He has supervised more than 20 Ph.D. candidates and is an author and editor of several books and more than 100 papers on digital image processing and digital holography.
