
Implementation and analysis of the effects of white balance and light source on images
Floyd Willis Patricio
*floydpatibardolaza@gmail.com

Abstract
This paper presents the effects of light sources on a produced image and how these images are enhanced through white balancing: via the white-balance settings of a digital camera, and via the white patch algorithm (WPA) and gray world algorithm (GWA) implemented in Scilab. It was found that the white balance setting of a camera should agree with the correlated color temperature (CCT) of the light source to obtain an image with enhanced color quality. The WPA is an excellent technique for white balancing an overlaid color image, while the GWA is the better technique for white balancing an old photograph. Also, the color perceived in an object was rendered from the spectral information of the object. The rendered colors were not exactly the same as, but at least close to, the expected colors, which supports the theory that the spectral information of an object highly affects its perceived color.

1. Introduction
The color that the human eye sees is affected by three vital things: (a) the light source that illuminates the object, (b) the surface of the object and (c) the sensitivity of the eye or the sensor [1]. A slight change in any of these three may bring large changes to the perceived color of the object.

When an object is illuminated by a light source, what the human eye perceives is the light reflected by the surface of the object:

L(λ) = R(λ) I(λ)        (1)

where I(λ) is the illumination and R(λ) is the reflectance of the object. It must be noted that the illumination has little spatial variation, while the reflectance may have drastic spatial changes representing the details of the object's surface such as texture, smoothness, etc. [2].
For a colored digital image, its details are represented as an array of pixels overlaid as three different channels: a red channel, a green channel and a blue channel. The color of each pixel is an integral of the product of the spectral power distribution of the incident light source S(λ), the surface reflectance ρ(λ) and the spectral sensitivity of the camera η_i(λ) for the three different channels. This means that the color of a pixel per channel i is represented by the equation:

C_i = K_i ∫ S(λ) ρ(λ) η_i(λ) dλ        (2)

where dλ represents the interval at which the spectral measurements are made [3].
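As an illustration, equation (2) can be evaluated as a discrete sum over sampled spectra. The short Scilab sketch below uses a hypothetical flat illuminant and Gaussian-shaped reflectance and sensitivity curves only to show the computation; the actual spectra used later in this work were measured with a spectrometer or taken from the web.

// Minimal sketch of equation (2) as a discrete sum; all spectra here are assumed.
lambda = 400:10:700;                        // wavelength samples in nm
dlambda = 10;                               // sampling interval, the dλ term
S = ones(lambda);                           // assumed flat illuminant power
rho = exp(-((lambda - 620).^2)/(2*40^2));   // assumed reflectance peaking in the red
etaR = exp(-((lambda - 600).^2)/(2*50^2));  // assumed red-channel camera sensitivity
K = 1;                                      // normalization constant
C_R = K*sum(S .* rho .* etaR)*dlambda;      // red-channel value for this surface
disp(C_R)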
When an object is illuminated by different light sources, its color may appear different due to the difference in color temperature of the light sources used. This is because the reflection spectrum of the object is shifted away from its true color. This explains why a white object appears reddish when illuminated with a low-color-temperature light source such as an incandescent bulb. The human eye may barely notice this shift, but a digital camera records it. To compensate for this change in color, a white-balancing mechanism is employed [4].
This paper presents a study of the effects of light sources on a produced image and how these images are enhanced through white balancing. The last part of this paper explains how the spectral information of an object affects the perceived color of the object.

2. Theory
2.1 Gray World Algorithm
Two white-balancing algorithms were utilized in this study. The first method makes use of the Gray World algorithm, which works under the assumption that the average reflectance of a scene is achromatic. This means that, under this algorithm, the averages of the R, G and B channels of a given scene are roughly the same. A gain factor can therefore be applied to the channels so that each channel mean becomes equal to a reference value, which is simply the average of the three channel means.
Given an N×N RGB(x, y) image, where x and y index the pixel location, the individual color components of the image are R_sensor(x, y), G_sensor(x, y) and B_sensor(x, y), respectively. Their channel averages are

R_avg = (1/N²) Σ_(x,y) R_sensor(x, y)
G_avg = (1/N²) Σ_(x,y) G_sensor(x, y)        (3)
B_avg = (1/N²) Σ_(x,y) B_sensor(x, y)
The gray world assumption is satisfied when these three values are identical. The reference component is the mean of R_avg, G_avg and B_avg, which we denote as RGB_mean. The gain for each channel is computed by dividing RGB_mean by the corresponding channel average:

α_R = RGB_mean / R_avg,  α_G = RGB_mean / G_avg,  α_B = RGB_mean / B_avg        (4)

From these gains, the corrected image is obtained by multiplying each channel of the original image by its respective gain [5]:

R_corrected(x, y) = α_R R_sensor(x, y)
G_corrected(x, y) = α_G G_sensor(x, y)        (5)
B_corrected(x, y) = α_B B_sensor(x, y)
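As an illustration, the listing below is a minimal Scilab sketch of equations (3)-(5) that computes the channel averages directly with mean(); the full script actually used in this work, which obtains the averages from the channel histograms, is given in the Appendix. The filename is only a placeholder, and the final clipping step is added here only as a safeguard in case a gain pushes a value above 255.

// Gray World sketch: per-channel gains from channel means (equations 3-5)
img = imread('scene.jpg');                        // hypothetical unbalanced RGB image
R = double(img(:,:,1)); G = double(img(:,:,2)); B = double(img(:,:,3));
Ravg = mean(R); Gavg = mean(G); Bavg = mean(B);   // channel averages, equation (3)
RGBmean = (Ravg + Gavg + Bavg)/3;                 // reference component RGB_mean
R = R*(RGBmean/Ravg);                             // apply the gains, equations (4)-(5)
G = G*(RGBmean/Gavg);
B = B*(RGBmean/Bavg);
R(find(R > 255)) = 255; G(find(G > 255)) = 255; B(find(B > 255)) = 255;  // safeguard clip
img(:,:,1) = R; img(:,:,2) = G; img(:,:,3) = B;
imshow(img)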

2.2 White Patch Algorithm

The second white-balancing method is based on the Retinex theory, which argues that perceived white is associated with the maximum cone signals. To implement this method, a white patch (hence the name of the algorithm) must be extracted from the unbalanced photo. This patch should be a region that would be perceived as white in the real world. It does not have to be the pixel with the maximum RGB value, because there may be only a few maximum-valued pixels, which would not give a satisfactory divider for the whole image. From the white patch (WP), the average value of each of its RGB components must be computed:

R_ave = (1/N_WP) Σ_((x,y)∈WP) R_sensor(x, y)
G_ave = (1/N_WP) Σ_((x,y)∈WP) G_sensor(x, y)        (6)
B_ave = (1/N_WP) Σ_((x,y)∈WP) B_sensor(x, y)

where R_ave, G_ave and B_ave are the averages of the respective RGB channel arrays of the white patch and N_WP is the number of pixels in the patch. These values are then used as dividers for the original image arrays to obtain the corrected image [5]:

R_corrected(x, y) = (255 / R_ave) R_sensor(x, y)
G_corrected(x, y) = (255 / G_ave) G_sensor(x, y)        (7)
B_corrected(x, y) = (255 / B_ave) B_sensor(x, y)

To further improve the quality of the corrected image, any value of R_corrected, G_corrected or B_corrected greater than 255 is clipped to 255.
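As an illustration, the following minimal Scilab sketch implements equations (6) and (7): it crops the white patch directly from the image using an assumed pixel range, while the full script used in this work (see the Appendix) reads a patch that was cropped and saved beforehand. The filename and patch location are placeholders.

// White Patch sketch: patch averages as dividers, scaled to 255 (equations 6-7)
img = imread('scene.jpg');                 // hypothetical unbalanced RGB image
wp = img(100:150, 200:250, :);             // assumed location of a "white" region
for c = 1:3
    ave = mean(double(wp(:,:,c)));         // equation (6) for channel c
    chan = double(img(:,:,c))*(255/ave);   // equation (7)
    chan(find(chan > 255)) = 255;          // clip values above 255
    img(:,:,c) = chan;
end
imshow(img)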


3. Methodology
This study primarily aims to analyze the effects of light sources on a produced image and how these images are enhanced through white balancing. Several aspects were considered to broaden the scope of this study, and these were divided into four parts.

3.1 Montage of Images under different White Balance settings and different light sources
The first part of this study focuses on the effect of different white balance settings for varying correlated color temperature (CCT) of the light source. Objects of varying colors were collected and assembled under one type of light source. Using a digital camera (a Sony NEX-5N), these objects were captured in a photo for the different white balance settings available in the camera. The white balance settings used were Incandescent, Fluorescent Warm White, Fluorescent Cool White, Fluorescent Day White, Fluorescent Daylight, Sunny, Shade, Cloudy and Automatic White Balance (AWB).
Afterwards, the method was repeated for different light sources. The sources used in this study were as follows: Cloudy, Fluorescent, Sunny and Incandescent. After capturing the photos, a montage was created in which the photos were arranged in an array: each row contained images of the scene taken under the same light source and each column contained scenes captured under the same white balance setting. The rows and columns were ordered in increasing correlated color temperature.
The range of colors produced by the camera for every image was analyzed. It should be noted that AWB does not have a fixed CCT; in this mode the camera adjusts automatically depending on the light it perceives from the surroundings. Images taken under AWB were used as a reference for comparison with the best images from the different white-balanced photos.

3.2 Implementation of White Patch and Gray-World Algorithm to an overlaid image
An overlaid RGB image from a previous activity (Activity 4) was used for the implementation of the White Patch and Gray World algorithms. The quality of the produced image after white balancing was compared with the original image, and the results of the two white-balancing algorithms were compared with each other. The algorithms were implemented in Scilab.

3.3 Implementation of White Patch and Gray-World Algorithm to an old photograph
The second and third parts of the study aimed to implement and compare two commonly known white-balancing algorithms. Using a scanned old photograph, the two algorithms were applied. The quality of the images before and after processing was compared and analyzed. The algorithms were implemented in Scilab.

3.4 Rendering color from an object's spectral information
The last part of the activity aims to render the color of an object by computing its RGB values from its emittance-reflectance spectrum and the camera sensitivity spectra. The emittance-reflectance spectra were captured using a Jaz spectrometer and the camera sensitivity spectra were taken from the web. Figure 1 displays the set-up used in gathering the emittance-reflectance spectra of the objects. For this study, three objects were used: a red spectrometer packaging box, a blue canister and an apple-green notebook.
The spectral values of each object were processed in Microsoft Excel using equation 2 to get the RGB values of the object. A square patch rendering the computed RGB values was created in MS PowerPoint. The rendered color was compared with the actual color of the object in a photograph of the object. The RGB values of the actual object were determined using Photoshop.
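As an illustration only, the short Scilab sketch below shows how the same computation could have been done programmatically instead of in Excel: the measured emittance-reflectance spectrum of the object is weighted by each camera sensitivity curve and summed over wavelength, following equation 2. The input vectors are placeholders for data sampled on a common wavelength grid, and the scaling of the sums onto the 0-255 range (the constant K in equation 2) is an assumed choice here.

// L: measured emittance-reflectance spectrum; etaR, etaG, etaB: camera
// sensitivity curves, all sampled on the same wavelength grid (assumed inputs)
function rgb = renderRGB(L, etaR, etaG, etaB)
    raw = [sum(L .* etaR), sum(L .* etaG), sum(L .* etaB)];  // equation (2) per channel
    rgb = round(255*raw/max(raw));   // assumed scaling: strongest channel set to 255
endfunction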


Figure 1. Set-up used to get the emittance-reflectance spectra for part 4 of the activity.
4. Results and Discussion
4.1 Montage of Images under different White Balance settings and different light sources

For the first part of the activity, the montage produced is shown in Figure 6. It must be noted that the white balance settings were arranged in increasing CCT from left to right and the light sources in increasing CCT from top to bottom. The Automatic White Balance (AWB) setting was used as the basis of comparison among the images in the array because, in this setting, the camera adjusts itself to its perceived CCT of the scene and does not have a fixed CCT value.
For the images taken under incandescent light (Row 1), it can be clearly seen that the best image produced was the one set at the Incandescent white balance setting, while the most poorly white-balanced image was the one set at the Cloudy white balance setting. Figure 2 shows a comparison of the expanded histograms of the images set at the Incandescent WB setting and at AWB. It can be observed that the photo set to Incandescent white balance is almost identical to the image under AWB. The image set at Cloudy white balance obviously suffered from poor white balancing. This is expected since the white balance setting does not agree with the light source used. It can also be observed from Figure 2 that the histogram of the red channel of the image taken under Cloudy WB did not peak at the left portion, unlike in the properly white-balanced images, while there are high-intensity peaks at the left portion of its blue channel. This is expected since poorly white-balanced images appear yellowish due to the very yellow illumination of the incandescent bulb used as the light source.
For the images taken under sunlight (Row 2), it was expected that the image set under the Sunny WB would be the best image since the color temperature of sunlight agrees with the white balance setting. The most poorly white-balanced image was the one set under the Incandescent WB, since it appears bluish. The histograms of the images taken under sunlight set at AWB, Sunny WB and Incandescent WB are compared in Figure 3. It can be observed that the AWB and Sunny WB images are almost the same, while the poorly white-balanced image under the Incandescent WB setting has lost some red channel values.
For the next set of images taken under fluorescent light, it can be observed that the Fluorescent Daylight setting produced the best photo. This photo is also comparable to the image set at AWB, while the photo taken under the Incandescent setting appears bluish. The histograms of the images produced under the Fluorescent Daylight WB setting, AWB and Incandescent WB are presented in Figure 4.
For the last row of images in the montage, taken under a cloudy sky, the pictures appear clear and properly white balanced for the settings from Fluorescent Day White up to Cloudy and for AWB. The Incandescent and Fluorescent Cool White settings, however, made the images turn slightly bluish, hence the poor histograms shown in Figure 5, which presents the RGB histograms of the images taken under the cloudy light source at AWB and the Cloudy WB setting.































Figure 2. Comparison of the RGB channel histograms of the images set at AWB (left), at the Incandescent WB setting (center) and at the poorly white-balanced Cloudy WB setting (right), all taken under incandescent light.

Figure 3. Comparison of the RGB channel histograms of the images set at AWB (left), at the Sunny WB setting (center) and at the poorly white-balanced Incandescent WB setting (right), all taken under sunlight.
























Figure 4. Comparison of the RGB channel histograms of the images set at AWB (left), at the Fluorescent Daylight WB setting (center) and at the poorly white-balanced Incandescent WB setting (right), all taken under the fluorescent light source.


Figure 5. Comparison of the RGB channel histograms of the images set at AWB (left), at the Cloudy WB setting (center) and at the poorly white-balanced Incandescent WB setting (right), all taken under the cloudy light source.



4.2 Implementation of White Patch and Gray-World Algorithm to an overlaid image

Figure 7 shows the result of applying the White Patch algorithm (WPA) to the overlaid image from Activity 4. It can be observed that white balancing improved the color quality of the image: the true colors of the objects came forth and became more emphasized. It can also be observed that the white portions of the image became very bright. This is due to the assignment of 255 to all pixel values that exceeded 255 after multiplying the array by 255 divided by the average value of the white patch. This can be seen in the increase of 255 values in the grayscale histogram of the image, as shown in Figure 8.
The Gray World algorithm (GWA) also properly white balanced the original image but retained its dim ambience because of the lack of bright pixels. It can be observed from its histogram in Figure 8 that no pixel in the image reached a value of 255. In fact, the histogram closely resembles that of the original image, differing only in that the range of gray values of the white-balanced image was stretched over more gray levels.
Comparing the two algorithms, it may be concluded that for this part the WPA gave a better result than the GWA.
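The increase in saturated white pixels described above can be checked numerically with a few lines of Scilab, reusing the same imhist and rgb2gray calls as in the Appendix; the filenames below are placeholders for the original and WPA-corrected images.

// Count grayscale pixels saturated at 255 before and after the WPA correction
orig = imread('overlaid.jpg');                 // hypothetical original overlaid image
wpa = imread('overlaid_wpa.png');              // hypothetical WPA result
[c0, v0] = imhist(rgb2gray(orig), 256, '');
[c1, v1] = imhist(rgb2gray(wpa), 256, '');
disp(c0(256), 'saturated pixels before:')      // bin 256 corresponds to gray level 255
disp(c1(256), 'saturated pixels after:')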



Figure 6. The montage of pictures of colored items taken under different light sources (rows) and different white balance settings (columns). Rows and columns are arranged in increasing correlated color temperature (CCT). The "F" in F (Warm White), F (Cool White), etc. stands for Fluorescent.
Figure 7. The original (non-white-balanced) overlaid image from Activity 4 (top left), the white-balanced image using the White Patch algorithm (WPA) (top right) and the white-balanced image using the Gray World algorithm (GWA) (bottom).





Figure 8. Comparison of the gray-level histograms of the original overlaid image from Activity 4 (right), the image white-balanced using the WPA (center) and the image white-balanced using the GWA (left).



4.3 Implementation of White Patch and Gray-World Algorithm to an old photograph

Figure 9 shows the result of applying the White Patch algorithm (WPA) to a scanned old photograph. It should be noted that, compared with the overlaid image of the previous part, the original image here already has white pixels with values close to 255; that is, its white pixels are whiter than those of the previous image. It is therefore expected that white balancing would produce only a minor difference from the original. The most obvious change after applying the WPA is a brighter photo: the shirt of the boy is relatively whiter and brighter than in the original. It can also be observed that some details of the white background were lost due to the very bright white pixels produced.
The Gray World algorithm (GWA) also white balanced the image properly and, just like in the previous part, did not produce saturated bright pixels. The appearance of the image changed from slightly reddish to slightly greenish. Comparing the two algorithms, it may be concluded that for this part the GWA produced the better image, since the result was not too bright and did not lose details of the photograph.















Figure 9. The original version of the old photograph (right), the white-balanced version using the White Patch algorithm (WPA) (center) and the white-balanced version using the Gray World algorithm (GWA) (left).

A comparison of the histograms of the images in Figure 9 is shown in Figure 10. It can be observed that the histograms of the original image and of the GWA result are somewhat alike, while the histogram of the WPA result is very different: almost all values are concentrated at high gray levels, and a large number of pixels reach 255, which is evident in the very bright image.



Figure 10. Comparison of the gray-level histograms of the original old photograph (right), the version white-balanced using the White Patch algorithm (WPA) (center) and the version white-balanced using the Gray World algorithm (GWA) (left).




4.4 Rendering color from an object's spectral information

Emittance-reflectance spectra were taken from three items of different colors in order to render their respective colors. Figure 11 shows the comparison of the calculated and the actual color of the red spectrometer's packaging box. The red color rendered from the calculation is observably close to the actual one. The actual color of the red box was determined with the color-picker tool of Photoshop, which displays the RGB value of a specified pixel in an image. The graph comparing their RGB values shows close similarity as well.
Figure 11. The calculated color patch of the red spectrometer and the actual color patch of the spectrometer (top left), a photograph of the
actual spectrometer (bottom left) and the comparison of calculated and actual RGB values of the rendered colors (right).



Just like the result for the spectrometer box, the RGB calculation from the spectral information gave a color that closely resembles the actual color of the canister. The graph in Figure 12 shows that the calculated and actual RGB values of the canister have almost the same proportions relative to the other channels.

















Figure 12. The calculated color patch of the blue canister and the actual color patch of the item (top left), a photograph of the actual blue canister (bottom left) and the comparison of the calculated and actual RGB values of the rendered colors (right).


Figure 13 shows the rendered color of the apple-green notebook used in the activity. It can be observed that the rendered color is somewhat farther from what was expected. The red channel component increased relative to the green channel, which made the calculated color slightly brownish, although the blue channel component agreed with the actual one, being very low compared to the other channels.








Figure 13. The calculated color patch of the green notebook and the actual color patch of the item (top left), a photograph of the notebook (bottom left) and the comparison of the calculated and actual RGB values of the rendered colors (right).

From the results gathered in this part of the activity, the rendered colors for the red spectrometer box and the blue canister were good. The rendered colors were not exactly the same as, but at least close to, the expected colors. This shows that the spectral information of an object can be used to replicate its color. It also agrees with the fact that the light source, the reflectance of the surface and the sensitivity of the camera highly affect the quality of the image.

5. Conclusion

This paper presented the effects of light sources on a produced image and how these images are enhanced through white balancing. It was found that, when using a digital camera as the sensor and white balancing to enhance the image being taken, it is necessary to use a white balance setting with approximately the same correlated color temperature as the light source. A montage was constructed to demonstrate how the color quality of the image is degraded when the white balance setting does not agree with the CCT of the light source.
Two white-balancing techniques were applied to an overlaid colored photo and to an old photograph. For the colored photo, the White Patch algorithm (WPA) showed excellent results: the corrected image was bright and its colors were more enhanced and detailed. For the old photograph, the Gray World algorithm (GWA) gave the better result because the photo was not too bright, which prevented the loss of information in the photograph, as supported by the histogram plots.
For the last part of the activity, the rendered colors for the red spectrometer box and the blue canister showed good results. The rendered colors were not exactly the same as, but at least close to, the expected colors. This shows that the spectral information of an object can be used to replicate its color, and agrees with the fact that the light source, the reflectance of the surface and the sensitivity of the camera highly affect the quality of the image.


References
[1] E. Hecht, Optics, 2nd ed., Addison-Wesley, 1987.
[2] Lecture handouts, E180: What determines the color we see?, Harvey Mudd College, <http://fourier.eng.hmc.edu/e180/lectures/color1.pdf>, accessed 10 July 2013.
[3] Lecture handouts, Applied Physics 187, National Institute of Physics, 1st semester 2013-2014.
[4] Y. C. Liu et al., "Automatic White Balance for Digital Camera," IEEE Transactions on Consumer Electronics, vol. 41, no. 3, August 1995.
[5] E. Y. Lam and G. S. K. Fung, "Automatic White Balancing in Digital Photography," Taylor and Francis Group LLC, 2009.





Appendix (Scilab Code)

White Patch Algorithm

stacksize(200000000)
// g is the extracted white patch from the image, cropped and saved beforehand
g = imread('C:\Users\Christian\Dropbox\Acads\1st sem 2013\AP 187\act 5\wold.jpg')
// average of each channel of the white patch, equation (6)
r = mean(double(g(:,:,1)));
gr = mean(double(g(:,:,2)));
b = mean(double(g(:,:,3)));

// image to be white balanced
mat = imread('C:\Users\Christian\Dropbox\Acads\1st sem 2013\AP 187\act 5\oldpic.jpg');
orig = mat;

// scale each channel by 255 over the white-patch average, equation (7),
// then clip values greater than 255
s = double(mat(:,:,1))*(255/r);
s(find(s>255)) = 255;
mat(:,:,1) = s;
w = double(mat(:,:,2))*(255/gr);
w(find(w>255)) = 255;
mat(:,:,2) = w;
q = double(mat(:,:,3))*(255/b);
q(find(q>255)) = 255;
mat(:,:,3) = q;

// display and save the original and corrected images side by side
wb = [orig mat];
imshow(wb)

imwrite(wb, 'whitepatcholdpic.png')

// grayscale histograms of the original and corrected images
scf(0); imhist(rgb2gray(orig), 256, '');
scf(1); imhist(rgb2gray(mat), 256, '');


Gray World Algorithm

stacksize(200000000)
// image to be white balanced
mat = imread('C:\Users\Christian\Dropbox\Acads\1st sem 2013\AP 187\act 5\oldpic.jpg');

orig = mat;

// histogram of each channel; the channel average is computed as the
// weighted mean of the gray levels, equivalent to equation (3)
R = mat(:,:,1);
[countr, cellsr] = imhist(R, 256, '');
G = mat(:,:,2);
[countg, cellsg] = imhist(G, 256, '');
B = mat(:,:,3);
[countb, cellsb] = imhist(B, 256, '');
aveR = sum(countr.*cellsr)/sum(countr);
aveG = sum(countg.*cellsg)/sum(countg);
aveB = sum(countb.*cellsb)/sum(countb);
// reference component RGB_mean
ave = (aveR + aveG + aveB)/3;

// per-channel gains, equation (4)
Ralpha = ave/aveR;
Galpha = ave/aveG;
Balpha = ave/aveB;

// apply the gains to each channel, equation (5)
mat(:,:,1) = double(R)*Ralpha;
mat(:,:,2) = double(G)*Galpha;
mat(:,:,3) = double(B)*Balpha;

// display and save the original and corrected images side by side
gw = [orig mat];
imshow(gw)

imwrite(gw, 'grayworldoldpic.png')
// grayscale histogram of the corrected image and the channel averages
scf(6); imhist(rgb2gray(mat), 256, '');
disp(aveR, 'R', aveG, 'G', aveB, 'B')
