Manuscript Draft
Manuscript Number: ASOC-D-11-00203
Title: Robust Digital Watermarking for Compressed 3D Models Based on Polygonal Representation
Article Type: Full Length Article
Keywords: Robust Watermarking, Spherical Wavelet Transformation, Artificial Intelligence, Multi-layer
Feed-Forward Neural Network, Attacks, Butterfly Transform Method, Lossy Compression, BER
Abstract: Multimedia has recently played an increasingly important role in various domains, including
Web applications, movies, video games and medical visualization. The rapid growth of digital media
data over the Internet, on the other hand, makes it easy for anybody to access, copy, edit and distribute
digital contents such as electronic documents, images, sounds and videos. Motivated by this, much
research has been dedicated to developing methods for digital data copyright protection, tracing
ownership and preventing illegal duplication or tampering. This paper introduces a robust digital
watermarking methodology applied to compressed polygonal 3D models. It is based on the
well-known spherical wavelet transformation, applied after the compressed 3D model has been
produced by a neural network, and we show in this work that applying the watermarking algorithm in
the compressed domain of a 3D object is more effective, efficient and robust than working in the
uncompressed domain.
I. INTRODUCTION

II. BACKGROUND

III. RELATED WORK
Figure 4.2: The point cloud angel 3D object model before compression (237018 vertices, 474048 polygons) and after compression (71101 vertices, 142214 polygons).

[Figure: a 3D object model before compression (543652 vertices, 1087716 polygons, 19576416 bytes on disk) and after compression (108556 vertices, 217542 polygons).]

Figure 4.6: The point cloud horse 3D object model before compression (48485 vertices, 96966 polygons) and after compression (14546 vertices, 29088 polygons).

Figure 4.7: The shaded cow 3D object model before and after compression.
Model                    Angel Model   Happy Model   Horse Model   Cow Model
(unlabeled)              0.30000       0.20000       0.30000       0.30000
Edges collapsed          165917        435087        33939         2032
No. of final edges       213321        326313        43632         2610
Compression ratio        3.33304       5.05457       3.33343       3.33384
(unlabeled)              0.69465       0.82077       0.79666       0.76822
(unlabeled)              0.24736       0.20456       0.00527       0.18737
*Execution time (CPU)    76.74         191.65        15.35         1.10
[Figure: Noise Ratio (0.1 to 0.6) versus Compression Ratio (3.5 to 5.5).]
The spherical harmonics are defined as

Y_l^m(θ, φ) = k_{l,m} P_l^m(cos θ) e^{imφ}    (5.1)

where k_{l,m} is a normalization constant and P_l^m is the associated Legendre polynomial. Therefore, any spherical function f: S² → R can be expanded as a linear combination of spherical harmonics:

f(θ, φ) = Σ_{l=0}^{∞} Σ_{m=-l}^{l} c_{l,m} Y_l^m(θ, φ)    (5.2)

with coefficients given by the projection

c_{l,m} = ∫_{S²} f(θ, φ) conj(Y_l^m(θ, φ)) dω    (5.3)
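As an executable illustration of Eqs. (5.1) and (5.2), the sketch below evaluates P_l^m by the standard three-term recurrence and assembles Y_l^m with the normalization constant k_{l,m}; this is a minimal pure-Python sketch with illustrative function names, not code from the paper.

```python
import math

def assoc_legendre(l, m, x):
    """Associated Legendre polynomial P_l^m(x), 0 <= m <= l, via the
    standard recurrences (Condon-Shortley phase included)."""
    pmm = 1.0
    if m > 0:
        # P_m^m(x) = (-1)^m (2m - 1)!! (1 - x^2)^(m/2)
        s = math.sqrt((1.0 - x) * (1.0 + x))
        fact = 1.0
        for _ in range(m):
            pmm *= -fact * s
            fact += 2.0
    if l == m:
        return pmm
    pmmp1 = x * (2 * m + 1) * pmm           # P_{m+1}^m
    if l == m + 1:
        return pmmp1
    for ll in range(m + 2, l + 1):          # upward recurrence in l
        pll = ((2 * ll - 1) * x * pmmp1 - (ll + m - 1) * pmm) / (ll - m)
        pmm, pmmp1 = pmmp1, pll
    return pmmp1

def k_norm(l, m):
    """Normalization constant k_{l,m} of Eq. (5.1)."""
    return math.sqrt((2 * l + 1) / (4 * math.pi)
                     * math.factorial(l - m) / math.factorial(l + m))

def sph_harm(l, m, theta, phi):
    """Spherical harmonic Y_l^m(theta, phi) of Eq. (5.1), for m >= 0."""
    return (k_norm(l, m) * assoc_legendre(l, m, math.cos(theta))
            * complex(math.cos(m * phi), math.sin(m * phi)))
```

In a full expansion per Eq. (5.2), these values would be weighted by the coefficients c_{l,m} obtained by projecting the spherical parameterization of the mesh onto each Y_l^m.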
3: Generate the Spherical Wavelet Transformation

Wavelets have proved to be powerful bases for signal processing because they require only a small number of coefficients to represent general functions and large data sets. Owing to their local support in both the spatial and frequency domains, which makes them well suited for sparse approximation of functions, the spherical wavelet proposed in [19] is chosen in this work, and the butterfly wavelet transformation is selected in particular. The following is a brief description of the wavelet transformation in general and the butterfly wavelet transformation in particular:
For the general wavelet transformation analysis [19]:

Forward wavelet transform (analysis), fine to coarse:

λ_{j,k} = Σ_{i ∈ K(j)} h̃_{j,k,i} λ_{j+1,i}    (5.4)    (scaling-function coefficients)

γ_{j,m} = Σ_{i ∈ M(j)} g̃_{j,m,i} λ_{j+1,i}    (5.5)    (wavelet coefficients)

Inverse wavelet transform (synthesis), coarse to fine:

λ_{j+1,i} = Σ_{k ∈ K(j)} h_{j,k,i} λ_{j,k} + Σ_{m ∈ M(j)} g_{j,m,i} γ_{j,m}    (5.6)

Starting from λ_{n,·}, where n is the finest resolution level, the forward transform is applied to obtain γ_{j,·}, the wavelet coefficients at the current level.
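To make the analysis/synthesis pattern of Eqs. (5.4) to (5.6) concrete, here is a minimal Python sketch in which a simple Haar-style filter pair stands in for the butterfly filters (an illustrative assumption, not the paper's actual filters); perfect reconstruction can be checked directly.

```python
def analysis(lam_fine):
    """One forward step in the shape of Eqs. (5.4)-(5.5): compute the
    coarser scaling coefficients and the wavelet coefficients from
    level j+1.  Haar-style filters stand in for the butterfly filters."""
    half = len(lam_fine) // 2
    lam = [(lam_fine[2 * k] + lam_fine[2 * k + 1]) / 2 for k in range(half)]
    gam = [lam_fine[2 * m] - lam_fine[2 * m + 1] for m in range(half)]
    return lam, gam

def synthesis(lam, gam):
    """One inverse step in the shape of Eq. (5.6): rebuild level j+1
    from the scaling and wavelet coefficients of level j."""
    fine = []
    for k in range(len(lam)):
        fine.append(lam[k] + gam[k] / 2)
        fine.append(lam[k] - gam[k] / 2)
    return fine
```

Running the analysis step repeatedly from the finest level n yields the wavelet coefficients of each coarser level, as in the cascade described above; each synthesis step inverts an analysis step exactly, which is what makes embedding a watermark in the γ coefficients reversible.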
F(v_j) = (1/j) log v_j    (5.10)

where v_j ranges over all vertices of M belonging to level j.
Step 3: Watermark embedding

λ_{j+1,k} = λ_{j,k}    (5.11)

λ_{j+1,m} = γ_{j,m} + 1/2 (λ_{j+1,v1} + λ_{j+1,v2}) + 1/8 (λ_{j+1,f1} + λ_{j+1,f2}) - 1/16 (λ_{j+1,e1} + λ_{j+1,e2} + λ_{j+1,e3} + λ_{j+1,e4})    (5.12)
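In Eq. (5.12) the term added to γ_{j,m} is the classic butterfly prediction of the new vertex from its eight stencil neighbours (weights 1/2, 1/8, -1/16). A minimal sketch, with illustrative names and tuple vertices:

```python
def butterfly_predict(v1, v2, f1, f2, e1, e2, e3, e4):
    """Butterfly stencil of Eq. (5.12): 1/2 on the two edge endpoints,
    1/8 on the two opposite face vertices, -1/16 on the four outer
    vertices of the stencil."""
    return tuple(0.5 * (a + b) + 0.125 * (c + d) - 0.0625 * (p + q + r + s)
                 for a, b, c, d, p, q, r, s in zip(v1, v2, f1, f2, e1, e2, e3, e4))

def synthesize(gamma, neighbours):
    """Eq. (5.12) per coordinate: new position = wavelet coefficient
    (possibly carrying a watermark perturbation) + butterfly prediction."""
    return tuple(g + p for g, p in zip(gamma, butterfly_predict(*neighbours)))
```

On a locally flat neighbourhood the stencil weights sum to 1, so the prediction reproduces the neighbourhood point exactly and γ is near zero there; this is why perturbing γ to carry watermark bits introduces only a small geometric error.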
Table 4.1: The performance measurement of the watermarking algorithm in this work.

Performance metric    Angel Model   Happy Model   Horse Model   Cow Model
Geometric error       0.0550        0.0991        0.1912        0.2990
*Processing time      468.56        703.26        370.83        94.59

Hausdorff distance:

H(M1, M2) = max { max_{a ∈ M1} min_{b ∈ M2} d(a, b), max_{a ∈ M2} min_{b ∈ M1} d(a, b) }    (14)
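A brute-force sketch of the symmetric Hausdorff distance of Eq. (14), assuming the models are given as plain vertex lists and d(a, b) is the Euclidean distance (the function names are illustrative):

```python
import math

def hausdorff(M1, M2):
    """Symmetric Hausdorff distance of Eq. (14) between two vertex
    lists; O(|M1| * |M2|) brute force, adequate for small meshes."""
    def one_sided(A, B):
        # max over a in A of the distance from a to its nearest b in B
        return max(min(math.dist(a, b) for b in B) for a in A)
    return max(one_sided(M1, M2), one_sided(M2, M1))
```

Taking the maximum of both one-sided distances is what makes the measure symmetric; either one-sided term alone can underestimate the deviation between the original and watermarked mesh.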
VI.
To test the watermarking algorithm implemented in this work, the following attacks were applied to six 3D model samples:
Data capacity for a 3D object in bits <=    (5.15)
1. Translation (x+20, y-5, z-13).
2. Translation (x-2, y+13, z+5).
3. Rotation (y-coordinate 30).
4. Rotation (x-coordinate 30 and z-coordinate 60).
5. Scale (x-scale 0.6, y-scale 2, z-scale 3).
6. Scale (x-scale 3, y-scale 0.5, z-scale 0.2).
7. Smoothing mesh as noise filtering with regular subdivisions 1:4.
8. Lossy compression as provided by [22].
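Attacks 1 to 6 above are affine transformations of the vertex list and can be sketched directly; the helper names and the right-handed rotation convention (angles in degrees) are assumptions for illustration, since the text does not state a convention.

```python
import math

def translate(verts, dx, dy, dz):
    """Attacks 1-2: shift every vertex by a fixed offset."""
    return [(x + dx, y + dy, z + dz) for x, y, z in verts]

def rotate_y(verts, degrees):
    """Attack 3: rotate about the y-axis (right-handed convention assumed)."""
    c, s = math.cos(math.radians(degrees)), math.sin(math.radians(degrees))
    return [(c * x + s * z, y, -s * x + c * z) for x, y, z in verts]

def scale(verts, sx, sy, sz):
    """Attacks 5-6: non-uniform scaling per axis."""
    return [(x * sx, y * sy, z * sz) for x, y, z in verts]

# Attack 1 from the list: Translation (x+20, y-5, z-13)
attacked = translate([(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)], 20, -5, -13)
```

A robust watermark should survive these transformations because they change vertex coordinates globally while preserving the relative geometry that the wavelet coefficients encode.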
3. Processing Time

For this watermarking algorithm, most of the time consumed is spent on calculating coefficients with the spherical wavelet transformation; embedding and extracting the watermark take little time compared with the wavelet transformation itself. There is no mathematical way to calculate the processing time here, but the experimental results shown in Table 4.1 indicate that processing time increases with the number of vertices.
Table 5.1 shows the results that have been achieved by applying the watermarking algorithm in this work to the six models.
BER = CountErrors(Bits_in, Bits_out) / (Total number of bits)    (6.1)

SR = 1 - BER    (6.2)
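Eqs. (6.1) and (6.2) can be sketched as follows; the function names are illustrative, and the relation SR = 1 - BER is inferred from the BER/SR value pairs reported in the tables.

```python
def ber(bits_in, bits_out):
    """Eq. (6.1): fraction of extracted watermark bits that differ
    from the embedded ones."""
    errors = sum(1 for a, b in zip(bits_in, bits_out) if a != b)
    return errors / len(bits_in)

def sr(bits_in, bits_out):
    """Similarity ratio as reported alongside BER in the result
    tables: SR = 1 - BER (inferred relation)."""
    return 1.0 - ber(bits_in, bits_out)
```

A BER near 0 (SR near 1) after an attack means the watermark survived it; the lossy-compression and smoothing attacks show the largest BER in the tables below.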
3D model samples / BER for attacks:

Attack                              Angel Model   Happy Model   Horse Model   Cow Model
Lossy compression                   0.4888        0.3052        0.4272        0.3709
Translation (x+20, y-5, z-13)       0.0012        0.0014        9.7656e-004   6.1035e-004
Translation (x-2, y+13, z+5)        0.0011        0.0017        9.7666e-004   6.1075e-004
Rotation (y-coordinate 30)          0.0025        5.4932e-004   0.0031        0.0018
Rotation (x 30 and z 60)            0.0037        5.4911e-004   0.0039        0.0012
Scale (x 0.6, y 2, z 3)             0.0018        0.0061        0.0055        0.0049
Scale (x 3, y 0.5, z 0.2)           0.0023        0.0049        0.0061        0.0051
Smoothing (subdivisions 1:4)        0.1366        0.1831        0.3520        0.2191

3D model samples / robustness metrics against attacks:

Attack                              Metric   Angel Model   Happy Model   Horse Model   Cow Model
Lossy compression                   BER      0.0291        0.0535        0.0945        0.1764
                                    SR       0.9709        0.9465        0.9055        –
Translation (x+20, y-5, z-13)       BER      0.0018        4.2725e-004   0.0015        3.0518e-004
                                    SR       0.9982        0.9996        0.9985        0.9997
Translation (x-2, y+13, z+5)        BER      0.0011        4.2705e-004   0.0035        2.4454e-004
                                    SR       0.9989        0.9996        0.9965        0.9998
Rotation (y-coordinate 30)          BER      0.0020        9.7656e-004   0.0013        3.0508e-004
                                    SR       0.9980        0.9990        0.9987        0.9997
Rotation (x 30 and z 60)            BER      0.0026        9.7436e-004   0.0025        –
                                    SR       0.9974        0.9990        0.9975        –
Scale (x 0.6, y 2, z 3)             BER      8.2393e-004   4.5746e-004   1.5279e-004   7.0180e-004
                                    SR       0.9992        0.9995        0.9998        0.9993
Scale (x 3, y 0.5, z 0.2)           BER      8.2397e-004   4.5776e-004   1.5259e-004   –
                                    SR       0.9992        0.9995        0.9998        –
Smoothing (subdivisions 1:4)        BER      0.0183        0.0237        0.0243        –
                                    SR       0.9817        0.9763        0.9757        –
proposed in [2]
References
[1] R. Ohbuchi, M. Nakazawa and T. Takei, "Retrieving 3D shapes based on their appearance", ACM SIGMM Workshop on Multimedia Information Retrieval, Berkeley, California, pp. 39-46, 2003.
[2] J.-L. Dugelay, A. Baskurt and M. Daoudi (Eds.), 3D Object Processing: Compression, Indexing and Watermarking, John Wiley & Sons, Ltd., ISBN 978-0-470-06542-6, 2008.
[3] P. Besl and N. McKay, "A method for registration of 3-D shapes", IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(2):239-256, 1992.
Figure 6.2: The experimental results for the work in this paper.
[4] Chang, A.M. and Hall, L.O. (1992). The validation of fuzzy
VII. CONCLUSIONS
AUTHORS
K. F. Khataneh is with the Information Technology School, Al-Balqa Applied University, Salt, Jordan (e-mail: khalafk2002@yahoo.com).
N. Abu Romman is with Princess Sumaya University for Technology (PSUT), Department of Computer Graphics, Amman, Jordan (e-mail: Nadine@psut.edu.jo).