
A Study of Image Fusion Using

Wavelet-based Weighted-Average Technique


with Automatic Weight Adjustment
Wattanit Hotrakool
Department of Electronic and Electrical Engineering, University of Sheffield

I. INTRODUCTION
This coursework is given as part of the study in the Computer Vision programme. The aim of this coursework is to study and implement wavelet-based image fusion, one of the most popular image fusion techniques for fusing two or more images together with very high quality. The objectives of this coursework are 1) to understand the discrete wavelet transform, 2) to implement an image fusion technique that uses the wavelet transform to fuse various types of images, and 3) to study the effect of different parameters and decision rules.

II. BACKGROUND
Image fusion is an attempt to combine information from two or more images of the same scene into one single image. The resulting image is richer in information than any of the input images. Image fusion can be used in a wide variety of applications, such as combining information from many sensors or cameras in surveillance or remote sensing, combining information from many devices for better diagnosis in medical imaging, and improving image quality by solving out-of-focus or lighting problems in landscape pictures.
There are many kinds of input images for image fusion, for example multi-focus, multi-sensor, and multi-exposure images. Multi-focus images are images whose focus lies on different subjects of the same scene. Multi-sensor images are images of the same scene taken with different kinds of devices; CT and MRI images, or images from a normal camera and an IR camera, are examples of this kind. Multi-exposure images are images of the same scene taken with different amounts of light, so some images can be dark whereas others can be bright.
Many techniques have been proposed for image fusion. They can be divided into two categories: 1) image fusion in the spatial domain, for example the intensity weighted average and principal component analysis [1]; and 2) image fusion in a transform domain, which covers many different approaches. Popular transform approaches are multi-resolution decomposition (e.g. the Gaussian pyramid and the Laplacian pyramid) and image fusion using the discrete wavelet transform [2].
Wavelet-based fusion is normally the most popular among the techniques mentioned above because it has the multi-resolution property and captures both the structural and the detail information of the images. Therefore the fused image produced by this technique is usually of higher quality than that of the other techniques in most situations.
III. METHODOLOGY
A. Discrete Wavelet Transform
The 2-dimensional discrete wavelet transform (2D-DWT), sometimes called wavelet decomposition, is the process of transforming an image from the spatial domain into the wavelet domain [3]. The wavelet domain is the domain that represents the wavelet coefficients of the image. The wavelet decomposition is performed by passing the image through a series of low-pass and high-pass filters, as shown in Figure 1.
Figure 1. Procedure of the 2-D wavelet transform: the image is passed through low-pass (LP) and high-pass (HP) filters to separate the low-frequency and high-frequency subbands at each level.
Figure 2. The wavelet coefficient domain, showing the second-level subbands (LL2, LH2, HL2, HH2) and the first-level detail subbands (LH1, HL1, HH1).

Figure 3. An example of wavelet decomposition: (a) spatial domain, (b) wavelet domain.

The wavelet coefficients clearly show both the low-frequency and the high-frequency components of an image. In contrast with the Fourier transform, which shows only frequency information, the wavelet transform can show both the frequency (detail) and the structure of the image at the same time. Because of this advantage, image fusion in the wavelet domain should give better detail than fusion in the spatial domain, while being more structurally consistent than fusion in the frequency domain.
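For readers who wish to experiment, the sketch below shows a two-level decomposition using the MATLAB Wavelet Toolbox functions dwt2 and idwt2. This is an illustration only: the coursework code in the appendix calls custom F2DWT/I2DWT routines (not listed there), and the test image name is just a placeholder.

img = double(imread('cameraman.tif'));      % placeholder grayscale test image

% Level 1: approximation subband plus horizontal/vertical/diagonal details
[cA1, cH1, cV1, cD1] = dwt2(img, 'haar');

% Level 2: decompose the level-1 approximation again (the LL2/LH2/HL2/HH2 subbands)
[cA2, cH2, cV2, cD2] = dwt2(cA1, 'haar');

% The inverse transform reconstructs the level-1 input from its four subbands
rec = idwt2(cA1, cH1, cV1, cD1, 'haar');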

B. Fusion Rule
To perform image fusion, the images are fused together under a decision rule, i.e. the algorithm that decides how the images will be combined. Many decision rules have been proposed. Among them, the three most basic techniques are: 1) weighted average, 2) maximum coefficient, and 3) sliding window. The weighted average takes a weighted mean of all inputs as the output. The maximum coefficient takes the input with the highest value as the output. The sliding window applies either the weighted average or the maximum coefficient to small, local regions of the images. All of these techniques can be applied to both intensity-based and wavelet-based fusion.
In this coursework the fusion is wavelet-based, so the decision rule chosen here is applied to the wavelet coefficients of the images instead of their intensities. Research on image fusion [4], [5] shows that the weighted average is the best of the basic fusion rules in the general case; the wavelet weighted-average technique is therefore chosen as the fundamental fusion rule in this coursework.
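To make the three rules concrete, the following minimal MATLAB sketch applies each of them to a pair of placeholder coefficient arrays. It is an illustration only and is not part of the coursework code, which uses the weighted-average rule exclusively.

cA = randn(64);  cB = randn(64);      % placeholder wavelet-coefficient arrays
w  = 0.5;                             % averaging weight

fused_avg = w*cA + (1 - w)*cB;        % 1) weighted-average rule

fused_max = cA;                       % 2) maximum-coefficient rule
mask = abs(cB) > abs(cA);
fused_max(mask) = cB(mask);

% 3) The sliding-window rule applies either of the above to small local
%    blocks (e.g. via blockproc) rather than to the whole coefficient array.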

C. Automatic Weight Adjustment

The performance and quality of the fused image produced by the wavelet-based weighted-average fusion technique depend largely on the weights given to each input. The automatic weight adjustment technique is therefore proposed to find appropriate weights for any pair of input images and produce the most reasonable fused image. A reasonable image is defined here as one that contains an appropriate amount of information from all of its inputs; in particular, information from one input should not overpower the smaller details of the other, so that small details are preserved as much as possible. In this coursework the reasonable image is defined as the output image with the highest structural similarity index (SSIM), i.e.
O^{*} = \arg\max_{O \in I}\left(\mathrm{SSIM}(O, A) + \mathrm{SSIM}(O, B)\right)
where A and B are the input images, O is an output image, and I is the set of outputs produced over all weight values. The structural similarity index used here is a statistical index for measuring the similarity between any two images; its value ranges from 0 (totally different) to 1 (identical) [6]. The SSIM has advantages over traditional similarity measures such as the mean-squared error (MSE); one advantage is that SSIM results are more consistent with visual inspection, i.e. with what the human eye perceives. The SSIM is defined as
\mathrm{SSIM}(x, y) = \frac{(2\mu_x \mu_y + c_1)(2\sigma_{xy} + c_2)}{(\mu_x^2 + \mu_y^2 + c_1)(\sigma_x^2 + \sigma_y^2 + c_2)}
where \mu_x is the average of x, \mu_y is the average of y, \sigma_x^2 is the variance of x, \sigma_y^2 is the variance of y, \sigma_{xy} is the covariance of x and y, c_1 = (k_1 L)^2 and c_2 = (k_2 L)^2, where L is the dynamic range of the images (here 2^8 - 1 = 255 for 8-bit images) and k_1 and k_2 are 0.01 and 0.03 respectively.
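Written out in MATLAB, the SSIM above reduces to a few lines. The sketch below mirrors the FindSSIM routine listed in the appendix; the two test images used here are only placeholders.

x = double(imread('cameraman.tif'));                          % placeholder 8-bit images
y = double(imfilter(imread('cameraman.tif'), fspecial('average', 3)));

ux = mean2(x);   uy = mean2(y);
sx = std2(x);    sy = std2(y);
sxy = corr2(x, y)*sx*sy;              % covariance from the correlation coefficient
L = 255;  k1 = 0.01;  k2 = 0.03;
c1 = (k1*L)^2;   c2 = (k2*L)^2;
ssim_val = ((2*ux*uy + c1)*(2*sxy + c2)) / ((ux^2 + uy^2 + c1)*(sx^2 + sy^2 + c2));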

IV. DESIGN
In this coursework, the image fusion is implemented as a MATLAB program. The structure of the program is shown in Figure 4. The program consists of two main parts: 1) the image fusion algorithm, and 2) the weight adjustment algorithm. The image fusion algorithm is the same as any ordinary wavelet weighted-average fusion. The weight adjustment algorithm consists of computing the SSIM over the entire range of weights and then selecting the appropriate weight based on the computed SSIM; this weight is assigned to the weighted-average fusion rule.
However, since this technique relies on the similarity between the input and output images, the inverse discrete wavelet transform (IDWT) has to be computed in every iteration of the weight-varying loop, which reduces the overall speed of the algorithm.
The design of this algorithm also includes a mechanism to handle various input colour spaces, so every type of input image, in particular indexed-colour images, is converted to grayscale before fusion and before computing the SSIM.
Figure 4. Algorithm of the proposed wavelet weighted-average fusion with automatic weight adjustment.
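Following the usage notes in the appendix listing, the routine would be invoked as below; the file names are placeholders for a registered pair of input images.

[I, OW, OWTable] = DoFusion('inputA.png', 'inputB.png');           % defaults: s = 3, step = 0.05
[I2, OW2]        = DoFusion('inputA.png', 'inputB.png', 4, 0.01);  % level 4, finer weight sweep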

V. RESULTS

Figure 5. Result for multi-focus images: (a) input image A, (b) input image B, (c) fused image, (d) plot of weight against SSIM index.
Figure 6. Result for medical images: (a) input image A, (b) input image B, (c) fused image, (d) plot of weight against SSIM index.
Figure 7. Result for multi-sensor images: (a) input image A, (b) input image B, (c) fused image, (d) plot of weight against SSIM index.
Figure 8. Result for multi-temporal images: (a) input image A, (b) input image B, (c) fused image, (d) plot of weight against SSIM index.

VI. DISCUSSION

As seen from the results above, the outputs of this technique could be said to be "more natural", meaning that they look much like physically captured images. Even though these images are comfortable for a human to view, they sometimes do not contain very much information. This can be seen in the medical-image result (Figure 6) and in the scenery-image result, where both take a large portion of their information from the more informative input. However, these results can be said to be more pleasing to the eye than a fused image flooded with large amounts of equal-priority information.
This technique also has drawbacks: 1) the high-frequency information of both images is averaged together, so some sharp details are reduced, especially in multi-focus images; and 2) the input images need to be registered before the algorithm is run. If the input images are not registered, the computation of the SSIM will fail and take the entire algorithm down with it.
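One possible way to satisfy the registration requirement is intensity-based registration from the MATLAB Image Processing Toolbox before calling DoFusion. The snippet below is only a sketch; it is not part of the coursework code, and the file names are placeholders for a grayscale image pair.

fixed  = imread('inputA.png');                      % placeholder reference image
moving = imread('inputB.png');                      % placeholder unregistered image
[optimizer, metric] = imregconfig('multimodal');    % suitable for multi-sensor pairs
moving_reg = imregister(moving, fixed, 'affine', optimizer, metric);
[I, OW] = DoFusion(fixed, moving_reg);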
During this evaluation, a wavelet decomposition level of 3 was used, to limit the energy affected during processing [7]. However, if the wavelet decomposition level is changed, the result and quality of the fused image will change slightly; hence the SSIM values and the final weight may also change slightly.

VII. CONCLUSION
Wavelet-based image fusion is a very high quality image fusion technique. In this coursework, the wavelet-based weighted-average fusion technique with an automatic weight adjustment algorithm has been studied. The algorithm is proposed under the assumption that the best fused output is the one most similar to both input images. Therefore, the SSIM computation is introduced to measure this similarity and to find the best weight value for each pair of input images.

REFERENCES
[1] Mohammed R. Metwalli, Ayman H. Nasr, Osama S. Farag Allah, and S. El-Rabaie, "Image Fusion Based on Principal Component Analysis and High-Pass Filter," IEEE, 2009.
[2] Guoliang Tang, Jiexin Pu, and Xinhan Huang, "Image Fusion Based on Multi-wavelet Transform," Proc. IEEE International Conference on Mechatronics and Automation, 2006.
[3] Gonzalo Pajares and Jesús M. de la Cruz, "A wavelet-based image fusion tutorial," Pattern Recognition, vol. 37, 2004, pp. 1855-1872.
[4] Anjali Malviya and S. G. Bhirud, "Multi-Focus Image Fusion of Digital Images," International Conference on Advances in Recent Technologies in Communication and Computing, 2009.
[5] Anjali Malviya and S. G. Bhirud, "Wavelet Based Multi-Focus Image Fusion," International Conference on Methods and Models in Computer Science, 2009.
[6] Zhou Wang, Alan Conrad Bovik, Hamid Rahim Sheikh, and Eero P. Simoncelli, "Image Quality Assessment: From Error Visibility to Structural Similarity," IEEE Transactions on Image Processing, vol. 13, no. 4, April 2004.
[7] Xing Su-xia, Guo Pei-yuan, and Chen Tian-hua, "Study on Optimal Wavelet Decomposition Level in Infrared and Visual Light Image Fusion," International Conference on Measuring Technology and Mechatronics Automation, 2010.
APPENDIX
MATLAB CODE FOR IMAGE FUSION

function [I, OW, OWTable] = DoFusion(A, B, varargin)


% Fuse 2 images together into one single image using automatic weight
% adjustment and structural similarity index
%
% Usage: [I, OW, OWTable] = DoFusion(A, B)
% [I, OW, OWTable] = DoFusion(A, B, s)
% [I, OW, OWTable] = DoFusion(A, B, s, step)
%
% A, B are input images; they can be pre-read matrices or file names.
% s is the wavelet decomposition level, default s = 3
% step is the step size for varying the weight, default step = 0.05
% I is the output fused image
% OW is the optimal weight used for image I
% OWTable is a table storing all weight-SSIM pairs
%
% Author: Wattanit Hotrakool
% The University of Sheffield
% Registration No: 100135895
%
% Date 16/11/2010

% Read input images


img1 = input_translation(A);
img2 = input_translation(B);

% Get the decomposition level


s = get_decomposite_level(varargin);

% Get the step size


step = get_stepsize(varargin);

% Perform the wavelet transform


img1_wl = F2DWT(img1,s);
img2_wl = F2DWT(img2,s);

% Preallocate the OWTable (one row per weight value in 0:step:1)
OWTable = zeros(floor(1/step)+1,2);
idx = 1;

for Weight = 0:step:1

% Fuse with the current weight and reconstruct a candidate image
% (no abs() here, so the candidate matches the final fusion step below)
out_wl = Weight*img1_wl + (1-Weight)*img2_wl;
tempout = I2DWT(out_wl,s);
% Compute the combined SSIM of the candidate against both inputs
SSIM = FindSSIM(tempout,img1)+FindSSIM(tempout,img2);
% Record the weight and SSIM
OWTable(idx,1) = Weight;
OWTable(idx,2) = SSIM;
idx = idx+1;
end

% Find the weight corresponding to the maximum SSIM
[~,i] = max(OWTable(:,2));
FuseWeight = OWTable(i,1);

% Use the optimum weight to fuse and reconstruct the final image
out_wl = FuseWeight*img1_wl + (1-FuseWeight)*img2_wl;
out = I2DWT(out_wl,s);

I = uint8(out);
OW = FuseWeight;

figure(10), imshow(A),title('Input A');


figure(20),imshow(B),title('Input B');
figure(30), imshow(I),title('Fused Image');

end

function out = input_translation(in)


% load input image into a matrix.
% If the input image is specified as a file name, load image using
% imread command.
% If the input image is in index color space, convert the image into
% grayscale.

if (ischar(in))
[out,map] = imread(in);
if ~isempty(map)
% Indexed-colour image: convert to grayscale using its colour map
out = ind2gray(out,map);
end
else
out = in;
end

out = im2uint8(out);

end

function s = get_decomposite_level(inarg)
% Get the wavelet decomposition level from the optional arguments (default s = 3)

nofarg = size(inarg,2);
s = 3;
if nofarg >= 1; s = inarg{1}; end;
end

function step = get_stepsize(inarg)


% Get the step size
nofarg = size(inarg,2);
step = 0.05;
if nofarg >= 2; step = inarg{2}; end;
if step > 1 || step <= 0; step = 0.05; end;
end
function SSIM = FindSSIM(im1,im2)
% Compute the value of structural similarity (SSIM)
% If the input image is in RGB space, convert into grayscale.

if isrgb(im1); im1 = rgb2gray(im1); end


if isrgb(im2); im2 = rgb2gray(im2); end

ux = mean2(im1);
uy = mean2(im2);
sdx = std2(im1);
sdy = std2(im2);
varx = sdx^2;
vary = sdy^2;
% Covariance obtained from the correlation coefficient and the standard deviations
covxy = corr2(im1,im2)*sdx*sdy;
L = 255;
k1 = 0.01;
k2 = 0.03;
c1 = (k1*L)^2;
c2 = (k2*L)^2;
SSIM = ((2*ux*uy+c1)*(2*covxy+c2))/(((ux^2)+(uy^2)+c1)*(varx+vary+c2));

end

function [result] = isrgb(img)


% Check whether the image is in RGB or not.

result = (ndims(img) == 3);

end
