
Article

Development and Verification of a Novel Robot-Integrated Fringe Projection 3D Scanning System for Large-Scale Metrology

Hui Du 1,2, Xiaobo Chen 1,2, Juntong Xi 1,2,3,*, Chengyi Yu 1,2 and Bao Zhao 1,2

1 School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China; sdduhui@hotmail.com (H.D.); xiaoboc@sjtu.edu.cn (X.C.); jg5bvictor@sjtu.edu.cn (C.Y.); zhaobao1988@sjtu.edu.cn (B.Z.)
2 Shanghai Key Laboratory of Advanced Manufacturing Environment, Shanghai 200030, China
3 State Key Laboratory of Mechanical System and Vibration, Shanghai Jiao Tong University, Shanghai 200240, China
* Correspondence: jtxi@sjtu.edu.cn; Tel.: +86-21-3420-5693

Abstract: … profilometry of these surfaces plays a pivotal role in quality control. This paper proposes a novel and flexible large-scale 3D scanning system assembled by combining a robot, a binocular structured light scanner and a laser tracker. The measurement principle and system construction of the integrated system are introduced, and a mathematical model is established for the global data fusion. Subsequently, a robust method is introduced for establishing the end coordinate system. For hand-eye calibration, the calibration ball is observed by the scanner and the laser tracker simultaneously; with these data, the hand-eye relationship is solved, and an algorithm is built to obtain the transformation matrix between the end coordinate system and the world coordinate system. A validation experiment is designed to verify the proposed algorithms. Firstly, a hand-eye calibration experiment is implemented and the transformation matrix is computed. Then a car body rear is measured 22 times to verify the global data fusion algorithm, and the 3D shape of the rear is reconstructed successfully. To evaluate the precision of the proposed method, a metric tool is built and the results are presented.

Keywords: large-scale metrology; robot-integrated system; structured light profilometry; hand-eye calibration; global data fusion

1. Introduction

Large-scale thin-wall and surface components are widespread in modern high-end manufacturing industries, especially the automotive, shipbuilding, astronautical and aeronautical industries. In these fields, the surface forming quality largely determines the manufacturing quality of the corresponding component and ultimately affects the assembly quality to a great degree. Poor quality control means more pauses, modifications and even failures in the manufacturing process, which undoubtedly lengthens the production cycle and increases costs. Therefore, to meet the precision requirements of manufacturing plants and reduce time and manpower costs, an automatic, flexible and accurate large-scale 3D measurement method is of great importance, or even indispensable in some application scenarios.

Various methods have been developed in the field of large-scale 3D shape measurement and many instruments have been introduced for this purpose [1–4]. Among off-the-shelf products, the Leica T-Scan has good performance in large-range metrology for its high speed and hand-held

Sensors 2017, 17, 2886 2 of 13

property [5]. However, it is not so suitable for on-site automatic inspection, which is increasingly important in intelligent manufacturing. Traditionally, the coordinate measuring machine (CMM) has been extensively used in 3D shape measurement. For products of different sizes, different CMMs are correspondingly developed, with measurement ranges spanning from less than 1 m to several meters. With the development of visual sensors, computer vision technology and computing power, more and more kinds of vision measurement equipment are integrated with CMMs, which are usually used in contact measurement of dies, molds and so on [6–8]. However, the biggest drawback of this kind of method is that only limited types and numbers of products can be sampled and brought to the CMM for inspection. This means that even for relatively important products, it is difficult to obtain quality data for all components, which may allow quality failures to pass undetected. If the CMM is integrated into the production line, it usually will not perform well in precision and robustness. Recently, with improvements in precision manufacturing, robot kinematics and control engineering, robotic technologies have developed rapidly. All these factors make robots increasingly economical and practical in the manufacturing industry, so more and more visual sensors are integrated with robots to form more flexible measurement systems [9,10]. Furthermore, in some studies a turntable surrounding the robot is introduced to extend the metrology range of the robot system [11,12]. Because it combines the robot's flexibility with the accuracy of visual profilometry, this approach is promising for online inspection of large-scale parts.

Another 3D shape measurement technology widely used in large-volume profilometry is laser scanning [13–15]. The advantage of this kind of method is that it is easier to implement and more economical, and under the assumption of good calibration its accuracy can also be assured. However, its weakness is clear: in principle, laser scanning can only acquire data along one line or a few lines per measurement. To increase the amount of data at every planned measurement position, a movement mechanism, normally a linear rail or a rotary stage, must be integrated with the scanner [16]. However, the movement mechanism introduces movement errors into the system; to compensate for these errors it must be calibrated, which is itself a challenging task. Compared to laser scanning, structured light profilometry [17–21] acquires the data over a surface area in each measurement. Without a movement mechanism, the accuracy of every measurement depends only on the calibration of the visual sensors, and provided the calibration is designed and implemented carefully, the accuracy can be assured [22,23]. Additionally, this method acquires much more data per measurement than laser scanning, which yields more accurate metrology results than line scanning, and the area-scanning property gives it better time performance. For all these advantages, the structured light scanning method is promising for large-scale metrology. Some work has been done in this direction. Paoli et al. [24] mounted a binocular structured light scanner at the end of an anthropomorphic robot arm and mounted the robot on two linear guides (one horizontal and one vertical). By building a series of coordinate systems, the measured data is unified into a fixed coordinate system defined by a total station. This approach worked well in the measurement of a large yacht hull. However, as stated in Paoli's paper, the position of the total station must be set carefully to ensure that all the optical reflectors can be observed simultaneously, which to some extent limits the flexibility of the system.

Overall, compared to other technologies, structured light scanning is an accurate and efficient method for 3D metrology, and integrating a structured light scanner with a robot dramatically enlarges the measurement volume. With appropriate hand-eye calibration and coordinate transformation, a software-independent algorithm can be built, which makes large-scale data fusion a less challenging task. Until now, studies of this kind of system have been limited; therefore, more work should be done to improve the performance of this type of method.

In this paper, a novel integrated robotic scanning system is proposed for flexible and robust large-scale measurement. The system is composed of a binocular structured light scanner, a robot with six degrees of freedom (DOF) and a laser tracker. The structured light scanner is used to get the surface data at specific measurement positions. An optical target corner cube reflector (CCR) is fixed on the base of the scanner. The end coordinate system (ECS) is built by rotation of the scanner. The laser tracker is used to get the center of the target ball and to finish the data fusion. As for the hand-eye calibration, different from the traditional method, the transformation matrix is computed by observing the target ball using the scanner and the end coordinate system. After obtaining the transformation between the end coordinate system and the world coordinate system (WCS), all the data is combined into the same coordinate system. In this way, the data fusion is finished automatically.

The rest of the paper is organized as follows: Section 2 introduces the overall measurement principle and system construction; the building of the end coordinate system and the hand-eye calibration algorithm are also stated in this part. Section 3 presents the results of hand-eye calibration and global data fusion, together with a quantitative evaluation. The paper finishes in Section 4 with a short conclusion.

2. The Proposed Approach

2.1. Measurement Model and System Construction

The integrated robotic scanning system incorporates an industrial robot with six degrees of freedom (DOF), a laser tracker, a digital fringe projection (DFP) scanner and a CCR mounted on the scanner. The DFP scanner is fixed at the end effector of the robot. When the system works, the robot locates the scanner at the planned discrete positions, and the scanner acquires the 3D point cloud of the corresponding region.

Instead of choosing an off-the-shelf product, the scanner is a binocular DFP device developed according to the implementation circumstances (e.g., working distance, illumination and reflectivity of the surface). For its characteristics of high resolution and low sensitivity to ambient light, the three-frequency, three-step phase-shifting scheme is adopted to encode the fringe patterns. To access the data of a region, nine patterns are sequentially projected onto the surface by the projector and acquired by the two cameras from different directions. After phase decoding and image registration, the final 3D point data is acquired. The working principle of the DFP scanner is illustrated in Figure 1.

Figure 1. Measurement principle of the scanner (projector and left/right cameras capturing the fringe images).
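The decoding formula for the phase-shifting step is not printed in this excerpt. As a hedged illustration, assuming the common three-step convention with phase shifts of −2π/3, 0 and +2π/3 (the function name below is ours, not from the paper), the wrapped phase of one fringe frequency can be recovered as:

```python
import numpy as np

def wrapped_phase(I1, I2, I3):
    """Recover the wrapped phase from three fringe images with assumed phase
    shifts of -2*pi/3, 0 and +2*pi/3, where I_k = A + B*cos(phi + delta_k).
    The arctangent relation below follows from trigonometric identities."""
    return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

# Synthetic fringes with a known phase phi = 0.5 rad
phi = 0.5
A, B = 100.0, 50.0  # background intensity and fringe modulation
I = [A + B * np.cos(phi + d) for d in (-2 * np.pi / 3, 0.0, 2 * np.pi / 3)]
print(round(float(wrapped_phase(*I)), 6))  # -> 0.5
```

In practice this is applied pixel-wise to the captured images, and the three fringe frequencies are then combined to unwrap the phase.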

Like any other vision metrology system, the cameras should be previously calibrated. To enhance the accuracy of calibration and measurement, a more accurate calibration method is applied [25]. This approach acquires the calibration points in the form of a grid point array and obtains the calibration results by Zhang's algorithm [26]. With this approach, high accuracy of calibration and measurement can be assured.

The coordinate system of the integrated system comprises the measurement coordinate system (MCS), the ECS and the WCS, as shown in Figure 2. The ECS is defined by the rotation of the scanner. The WCS is fixed with respect to the laser tracker.

Figure 2. Definition of the coordinate system.

Let P be a point in the robot workspace. The mapping relationship between its coordinate PW in WCS and PM in MCS is expressed as follows:

PW = TE−W TM−E PM   (1)

TM−E is the transformation matrix between MCS and ECS, and TE−W denotes the transformation matrix between ECS and WCS. The acquired data of this integrated system is aligned and assessed in the world frame, which is defined by the laser tracker.
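Equation (1) is simply a chain of 4×4 homogeneous transforms applied to a homogeneous point. A minimal numpy sketch (the helper and the placeholder transforms below are ours, not calibration results from the paper):

```python
import numpy as np

def homogeneous(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Placeholder transforms for illustration only:
T_M_E = homogeneous(np.eye(3), np.array([10.0, 0.0, 0.0]))  # MCS -> ECS
T_E_W = homogeneous(np.eye(3), np.array([0.0, 5.0, 0.0]))   # ECS -> WCS

P_M = np.array([1.0, 2.0, 3.0, 1.0])  # point in MCS, homogeneous coordinates
P_W = T_E_W @ T_M_E @ P_M             # Equation (1): P_W = T_(E-W) T_(M-E) P_M
print(P_W[:3])                         # -> [11.  7.  3.]
```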

By combining the area-scanning structured light equipment, the laser tracker and the robot, this system reaches a high-level equilibrium of flexibility, velocity and accuracy. With the DFP scanner, the 3D shape information of the object is acquired at a single robot position. Through off-line programming, the whole surface of the workpiece or specified features can be measured. In this case, the robot is only used to carry the scanner, and all the acquired data is unified into WCS. Compared to other methods, this scheme avoids the error accumulation of multiple coordinate transformations and of robot main body calibration; therefore, high accuracy can be expected. The adoption of area scanning ensures high efficiency and resolution, which is crucial for subsequent data analysis. For a specified position, the scanning can be finished within 3 s, including the fringe projection time. With this integrated system, the complete measurement can be executed in a short period while the accuracy is maintained.

Mounting the scanner and CCR on the industrial robot, and putting the laser tracker (an API T3) in front of the six-DOF Fanuc M-710iC robot, the integrated 3D scanning system is constructed as illustrated in Figure 3a. The construction of the structured light scanner is shown in Figure 3b.

As shown in Figure 3a, to construct the ECS, a CCR is set on the scanner. When the system works, the scanner acquires the point cloud first, and then rotates to another two positions. Using these three points, the ECS is built. This method obviously reduces the constraints on the relative position between the laser tracker and the reflectors: only one CCR is used, and the ECS is constructed by the rotation of the scanner. Therefore, provided the laser tracker is placed at an appropriate position relative to the scanner, the ECS can be acquired smoothly by three rotations. Compared to other methods, this approach avoids most occlusions and is relatively more flexible and robust.

Figure 3. (a) Construction of the integrated 3D scanning system; (b) Binocular structured light scanner.

2.2. End Coordinate System Construction

As the first step in building the global data fusion algorithm, the ECS should be constructed. The robot's J6 axis and the scanner are used to implement this work. The CCR is put on the base of the scanner. When the system works, the scanner is positioned at a planned point and its position is acquired by the laser tracker. Then fringes are projected to acquire the point cloud of the corresponding surface area. After that, the J6 axis rotates another two times, and the position is recorded by the laser tracker at each stop. Finally, after three rotations, three points (P1, P2, P3) have been recorded. The first point P1 is taken as the origin. The X-axis points along the line connecting P1 and P2, the Z-axis is built by the vector cross product, and the Y-axis is obtained by the same method. In this way, the ECS is constructed. This process is illustrated in Figure 4.

Figure 4. Construction of ECS.
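The construction above can be sketched in a few lines. The paper gives no pseudocode, so the helper below is one reading of it, assuming the X-axis lies along P1P2, the Z-axis is normal to the plane of the three points, and the Y-axis completes a right-handed frame:

```python
import numpy as np

def build_ecs(P1, P2, P3):
    """Construct the ECS from three CCR positions recorded by the laser tracker.
    P1 is the origin, X points from P1 to P2, Z is the (normalized) cross product
    of P1P2 and P1P3, and Y = Z x X completes the right-handed frame."""
    P1, P2, P3 = map(np.asarray, (P1, P2, P3))
    x = (P2 - P1) / np.linalg.norm(P2 - P1)
    z = np.cross(P2 - P1, P3 - P1)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)  # unit length because z and x are orthonormal
    return P1, np.column_stack([x, y, z])  # origin and 3x3 matrix of axis columns

origin, axes = build_ecs([0, 0, 0], [1, 0, 0], [0, 1, 0])
print(axes)  # columns are the X, Y, Z axes; for these inputs, the identity matrix
```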

2.3. Hand-Eye Calibration

In the classic hand-eye calibration algorithm, to acquire the hand-eye transformation matrix, the robot takes the eye to several different positions to observe the same calibration rig, and the robot kinematics parameters are used to solve the transformation matrix. Different from the traditional method, in the proposed method the robot is only used as an orienting device. It is unnecessary to apply the robot's kinematics parameters, so the kinematics error can be bypassed, which benefits the calibration accuracy.

In the proposed method, a CCR is used as the calibration target ball. In the calibration process, the ball is measured by the structured light scanner and the laser tracker simultaneously (Figure 5). Firstly, to acquire the center of the target ball, the scanner is used to get the point cloud data. This data is used to obtain the ball center in MCS, denoted X_M^i; for ease of use, it is saved in homogeneous coordinate form. At the same time, the CCR ball is measured by the laser tracker; based on the working principle of the laser tracker, the ball center, denoted Ci, is acquired directly. To transform Ci into ECS, the ECS is first built as stated in Section 2.2. Taking P1 as the origin point, the coordinates are acquired by projecting the vector P1Ci onto the three ECS axes. This new coordinate of Ci is denoted X_E^i, which is also transformed to homogeneous coordinate form. By putting the CCR at several different positions in the scanner vision field, two homogeneous coordinate vector groups are constructed, as follows:

X_M = [ X_M^1  X_M^2  X_M^3  · · ·  X_M^i ]   (2)

X_E = [ X_E^1  X_E^2  X_E^3  · · ·  X_E^i ]   (3)

To solve the transformation matrix between the scanner and ECS, an equation is built as follows:

X_M = TE−M X_E   (4)

where TE−M is the hand-eye transformation matrix. It can be written in the following form:

TE−M = [ R3×3   T3×1
         01×3   1    ]4×4   (5)

Figure 5. Hand-eye calibration.

In this matrix, R denotes the rotation matrix and T the translation vector. By the property of rotation matrices, the following constraint holds:

R^T R = I   (6)

In this way, the computation of the transformation matrix can be cast as a constrained optimization problem:

min ‖X_M − TE−M X_E‖_F^2
s.t. R^T R = I3×3   (7)

This is an orthogonality-constrained fitting problem, which can be solved by singular value decomposition.
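Equation (7) is the standard constrained least-squares fit of a rigid transform, commonly solved by centering both point sets and taking an SVD of their cross-covariance (the Kabsch/Procrustes approach). A sketch, with illustrative function names:

```python
import numpy as np

def fit_rigid_transform(X_E, X_M):
    """Solve min ||X_M - (R X_E + T)||_F with R^T R = I via SVD.
    X_E, X_M: 3xN arrays of corresponding ball centers in ECS and MCS."""
    cE = X_E.mean(axis=1, keepdims=True)
    cM = X_M.mean(axis=1, keepdims=True)
    H = (X_M - cM) @ (X_E - cE).T                    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])   # guard against reflections
    R = U @ D @ Vt
    T = cM - R @ cE
    return R, T

# Synthetic check with a known rotation and translation:
rng = np.random.default_rng(0)
X_E = rng.random((3, 15))
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
X_M = R_true @ X_E + np.array([[1.0], [2.0], [3.0]])
R, T = fit_rigid_transform(X_E, X_M)
print(np.allclose(R, R_true), np.allclose(T, [[1.0], [2.0], [3.0]]))  # -> True True
```

The determinant guard ensures the result is a proper rotation rather than a reflection, which matters when the point sets are noisy or nearly planar.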

For structured light profilometry, the data on a surface area is acquired in every scan. Therefore, to get the entire data of a large-scale component, the measurement must be repeated many times, depending on the component's size. In this process, the scanner is carried to different positions by the robot, and the surface point cloud data at every position is obtained. To combine all the data, the position and pose of the end effector should be tracked.

For every measurement position, an {ECS}i is built by tracking the CCR ball. Let TEi−W be the transformation between {ECS}i and WCS. TEi−W can be written in the following form:

          [ nx  ox  px  tx ]
TEi−W  =  [ ny  oy  py  ty ]   (8)
          [ nz  oz  pz  tz ]
          [ 0   0   0   1  ]

In this matrix, (nx ny nz)^T, (ox oy oz)^T and (px py pz)^T are the unit vectors of the {ECS}i coordinate axes expressed in WCS, and (tx ty tz)^T is the position of the origin of {ECS}i in WCS. Until now, both the hand-eye transformation matrix TE−M and the transformation matrix TEi−W between {ECS}i and WCS have been obtained. To combine all the data, the following equation can be used:

TMi−W = TE−M TEi−W   (9)

Here TMi−W is the transformation matrix between MCS and WCS. With this equation, all the acquired data can be unified into WCS, and the data fusion can be finished automatically.
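With TE−M and the per-position TEi−W in hand, the fusion step is just a loop over the scans applying Equation (9) as printed. A hedged sketch (function and variable names are ours, not from the paper):

```python
import numpy as np

def fuse_scans(scans, T_hand_eye, T_pos_list):
    """Unify per-position scans into WCS.
    scans:      list of 4xN homogeneous point arrays in MCS, one per robot position
    T_hand_eye: 4x4 hand-eye matrix TE-M from Equation (5)
    T_pos_list: list of 4x4 {ECS}_i -> WCS matrices TEi-W from Equation (8)
    Applies Equation (9), TMi-W = TE-M TEi-W, as printed in the paper."""
    fused = []
    for P_M, T_pos in zip(scans, T_pos_list):
        T_M_W = T_hand_eye @ T_pos   # per-position MCS -> WCS transform
        fused.append(T_M_W @ P_M)
    return np.hstack(fused)          # one combined 4xN point cloud in WCS

# Identity hand-eye and one translated position: the MCS origin lands at (1, 2, 3)
T_hand_eye = np.eye(4)
T_pos = np.eye(4)
T_pos[:3, 3] = [1.0, 2.0, 3.0]
cloud = np.array([[0.0], [0.0], [0.0], [1.0]])
out = fuse_scans([cloud], T_hand_eye, [T_pos])
print(out[:3, 0])  # -> [1. 2. 3.]
```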

3. Results

To verify the proposed methodologies, several experiments were designed and implemented. Through the hand-eye calibration experiment, the transformation matrix between MCS and ECS is computed. Based on this relationship, the global data fusion experiment is executed and the 3D shape of a car body rear is acquired. For quantitative assessment, a metric tool is constructed and the evaluation results are demonstrated.

The hand-eye calibration algorithm was introduced in Section 2.3. According to the algorithm, the experiment is designed as shown in Figure 6. In the calibration process, the scanner and robot are kept still. The target ball is placed at 15 different positions within the scanner vision field. For every position, the ball is measured by the scanner and the laser tracker simultaneously. After this, the ball is set on the scanner and, after three rotations, the ECS is constructed. The data is shown in Table 1.

Figure 6. Hand-eye calibration experiment.

Table 1. Hand-eye calibration data.

      |            WCS                |            ECS                |           MCS
No.   |    X        Y        Z        |    X        Y        Z        |    X        Y        Z
No.1  | −603.234  −570.824  −623.064  | −654.862  −154.646  −104.806  | 289.155    88.323   605.415
No.2  | −527.529  −541.327  −623.513  | −667.577  −167.702   −25.628  | 285.935     9.639   622.489
No.3  | −440.814  −505.516  −621.711  | −679.309  −183.062   −81.623  | 280.020    66.197   639.968
No.4  | −456.599  −455.086  −621.325  | −649.743  −226.843    67.418  | 227.691   −83.119   640.253
No.5  | −540.046  −475.961  −622.876  | −631.899  −223.167   −16.662  | 220.646     0.517   624.339
No.6  | −634.259  −513.428  −622.370  | −616.476  −206.324  −115.448  | 225.462    98.891   602.866
No.7  | −657.153  −469.599  −621.906  | −587.494  −245.701  −122.851  | 176.813   106.193   601.124
No.8  | −581.189  −435.511  −622.558  | −598.298  −262.613   −42.041  | 169.253    25.723   618.982
No.9  | −476.216  −395.233  −620.951  | −614.414  −278.986    68.033  | 165.075    49.782   551.496
No.10 | −494.938  −344.801  −620.797  | −584.017  −323.345    66.487  | 111.551   −82.539   640.278
No.11 | −577.894  −369.441  −621.975  | −567.816  −316.284   −18.234  | 107.993     1.811   623.791
No.12 | −671.393  −671.393  −621.606  | −549.986  −304.248  −114.556  | 107.749    97.758   603.093
No.13 | −690.162  −353.111  −621.409  | −520.717  −346.523  −116.918  |  56.707    99.861   602.826
No.14 | −618.261  −325.513  −621.534  | −532.745  −358.414   −41.782  |  53.710    25.136   618.746
No.15 | −513.567  −284.161  −620.443  | −548.676  −375.996    68.258  |  48.626   −84.303   640.652

No.15 With this data,

−513.567 and by using

−284.161 the algorithm

−620.443 proposed

−548.676 in Section

−375.996 2.2, TE−M

68.258 is solved−finally,

48.626 84.303 which

640.652

is shown as follows:

algorithm0.0051 0.8262

proposed 2.2, TE−M is solved finally, which

in Section8.2035

is shown as follows: 0.8262 0.0065 0.5633 53.9117

TE M (10)

−0.5634

0.0025 10.0051

.0000 −0.0079

0.8262 11.7849

8.2035

0.8262

0.0000 .0000 −00.5633

00.0065 .0000 .0000

−153.9117

TE− M = (10)

0.0025 −1.0000 −0.0079 −11.7849
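Once T_{E−M} is known, chaining it with the other frame relationships requires inverting and composing 4×4 homogeneous transforms. A small sketch (illustrative names, assuming the usual [R t; 0 1] block layout for a rigid transform):

```python
def hmat_inverse(T):
    # Inverse of a rigid homogeneous transform: [R t]^-1 = [R^T, -R^T t].
    R = [row[:3] for row in T[:3]]
    t = [row[3] for row in T[:3]]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    mt = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [mt[0]],
            Rt[1] + [mt[1]],
            Rt[2] + [mt[2]],
            [0.0, 0.0, 0.0, 1.0]]

def hmat_mul(A, B):
    # Composition of two 4x4 homogeneous transforms.
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]
```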

3.2. Global Data Fusion

After the calibration of the hand-eye relationship, a car body rear with a size of 1400 mm × 500 mm × 400 mm was measured to verify the proposed scheme. The experiment system is illustrated in Figure 3. According to the path planning results, the scanner is carried by the robot to 22 different positions. For every position, the corresponding surface data is acquired by the structured light scanner. Figure 7 shows the point cloud data in the form of a triangular mesh representation.

Simultaneously, {ECS}_i is constructed by tracking the CCR ball. According to the method proposed in Section 2.3, T_{E−W}^i, the transformation matrix between {ECS}_i and WCS, is constructed. The hand-eye transformation matrix was presented by Formula (9) in Section 3.1. Therefore, T_{M−W}^i, the transformation matrix between MCS and WCS, can be built according to the algorithm proposed in Section 2.3. With this data, the point cloud data at each position can be transformed into WCS, and the global data fusion is implemented automatically. Figure 8 shows the multicolor representation of the point cloud data at each measurement position (Figure 8a) and the triangular mesh representation (Figure 8b) of the holistic car rear surface. It can be seen that there exist overlapping areas between adjacent scans; these overlapping areas have been used to evaluate the fusion accuracy. By proper path planning, the percentage of the overlapping area is set to 10% to 40%, which is enough for precision computation. In the fused data, about 20 million points are acquired, which is redundant for accuracy evaluation. Therefore, by a resampling algorithm, the number has been reduced to about 2 million.
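The fusion step itself reduces to mapping every scan through its per-position transform T_{M−W}^i and concatenating the results. A minimal sketch (hypothetical function names; the paper does not give implementation details):

```python
def transform_points(T, points):
    # Map 3-D points through a 4x4 homogeneous transform (rotation + translation).
    return [tuple(T[i][0]*x + T[i][1]*y + T[i][2]*z + T[i][3] for i in range(3))
            for x, y, z in points]

def fuse_scans(scans, transforms):
    # Bring every scan into the world coordinate system and concatenate.
    fused = []
    for pts, T in zip(scans, transforms):
        fused.extend(transform_points(T, pts))
    return fused
```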

Figure 7. Surface point clouds of the 22 measurements represented in the form of triangular meshes.
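The resampling mentioned above (about 20 million points reduced to about 2 million) can be realized in several ways; the paper does not specify the algorithm. One common choice is voxel-grid downsampling, sketched here with illustrative names: every occupied voxel keeps only the centroid of its points.

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    # Group points by the voxel they fall into, then keep one centroid per voxel.
    cells = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel_size) for c in p)
        cells[key].append(p)
    return [tuple(sum(p[i] for p in pts) / len(pts) for i in range(3))
            for pts in cells.values()]
```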

A simple visual inspection can be used to assess the alignment accuracy even without a proper metric tool. The striped patterns on the triangular mesh surface represent the misalignment errors between overlapping areas of different point clouds (Figure 8b). Although the visual assessment can afford a qualitative evaluation of the alignment precision, the result cannot be considered exhaustive. To acquire a quantitative result of the misalignment error, the proximity between the overlapping areas (Figure 9) of different point clouds is computed.

Figure 8. (a) Point cloud data fusion result; (b) triangular meshes representation.

Figure 9. Overlapping areas between all the aligned point clouds.

A metric tool has been developed to compute the translation and rotation errors. Compared to the perpendicular directions (the x and y directions), the misalignment error along the optical scanner viewing direction (the z direction) dominates [24]. For most of the car rear surface, the curvature is low. Therefore, the error along the z direction is the most significant for alignment precision evaluation.

The translation error is defined as the projection of the distance between the nearest points onto the normal vector of the fitting plane. As shown in Figure 10, given two different point clouds (PC1 and PC2), for each point set the mean distance (dm) between all the points is acquired. This distance is then used to get a radius which can be used to define a circle. With these points, and by a least-squares plane fitting estimation, the normal vectors (n1, n2) and the best fitting planes (π1, π2) can be computed [27]. Then the nearest point pairs are searched and the distances between these points are acquired. In this way, the distance d from C1 to PC2 is defined as d = |C1C2| cos∠EC1C2. The rotation error is defined as the angle between the unit vectors n1 and n2. By traversing all the points in the point cloud, the translation and rotation error computation is ultimately finished. As stated in [24], the accuracy of the least-squares fitting algorithm significantly depends on the radius r, which can only be estimated by empirical analyses [28]. In the presented case, the value is defined as r = 6 dm.

Figure 10. Definition of the distance and angle error.
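Given the plane normals from the least-squares fit, the two error measures above reduce to a projection and an angle. A compact sketch (helper names are illustrative), assuming unit-length normals:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def translation_error(c1, c2, n):
    # d = |C1C2| * cos(angle) is the projection of the vector C1 -> C2
    # onto the unit plane normal n, i.e. the out-of-plane misalignment.
    v = [c2[i] - c1[i] for i in range(3)]
    return abs(dot(v, n))

def rotation_error_deg(n1, n2):
    # Angle (in degrees) between two unit plane normals.
    c = max(-1.0, min(1.0, dot(n1, n2)))
    return math.degrees(math.acos(c))
```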


By using the metric tool, the translation and rotation errors are computed, and Figure 11 shows the results. In this figure, the horizontal axis represents the error value and the vertical axis denotes the percentage of the corresponding error. Figure 11a illustrates that, for most of the points (88.53%), the distance is less than 0.6 mm. If the threshold value is set to 1 mm, almost all the points (97.76%) are comprised. A similar situation occurs for the rotation error (Figure 11b): most of the error values are less than 10 degrees (97.21%). To demonstrate the errors more clearly, the maximum value (max), minimum value (min), mean value (µ) and standard deviation (σ) are also summarized in Table 2. With these quantitative statistical results, the quality of the data fusion can be assessed objectively.

Figure 11. (a) Distribution of the translation error; (b) distribution of the angle error.

Table 2. Statistics of the distance and angle error.

           max        min       µ         σ
d (mm)     1.5081     0         0.2965    0.2465
A (deg.)   20.0841    0.0029    2.8333    2.6185
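The quantities in Table 2 (together with the threshold percentages quoted for Figure 11) follow directly from the per-point error values. A minimal sketch, using Python's standard statistics module (function name is illustrative):

```python
import statistics

def error_summary(errors, threshold):
    # max / min / mean / population standard deviation, plus the fraction of
    # points whose error stays below the chosen threshold.
    return {
        "max": max(errors),
        "min": min(errors),
        "mean": statistics.fmean(errors),
        "std": statistics.pstdev(errors),
        "under_threshold": sum(1 for e in errors if e < threshold) / len(errors),
    }
```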

4. Conclusions

This paper presents an integrated system for large-scale component profilometry. In this system, a structured light scanner is built to acquire surface point cloud data at each position. The robot is only used as an orienting device in a large volume. By establishing the transformation relationship between the measurement coordinate system (MCS) and the world coordinate system (WCS), all the data is combined into WCS, which is defined by the laser tracker. For this system, the construction of the end coordinate system (ECS) plays a pivotal role. Here the CCR is mounted on the base of the scanner, and after three rotations the ECS is constructed. Additionally, different from the classic hand-eye calibration method, in this scheme the hand-eye transformation matrix is computed by a synchronized observation of the scanner and the laser tracker. This approach makes the hand-eye calibration independent of the robot kinematics parameters, which makes the calibration more robust and easier to implement. An algorithm is also built to solve the transformation matrix between ECS and WCS. In this way, all the data can be automatically combined into the unified coordinate system.

To verify the effect of the proposed method, corresponding experiments are designed and conducted. With this data, the transformation relationship between MCS and WCS is computed. Finally, all the data is combined into the same coordinate system, and the shape of a car body rear is reconstructed successfully. To evaluate the precision of the proposed method, a metric tool is developed and the accuracy data is presented. The translation error is less than 0.6 mm for most of the points (88.53%). A mean/maximum value of 0.2965/1.5081 mm is detected in the work volume. The standard deviation is 0.2465 mm. For the rotation error, the mean and maximum values are 2.8333 and 20.0841 degrees, respectively. The standard deviation of the rotation error is 2.6185 degrees. The mean value and standard deviation demonstrate that the integrated system exhibits good accuracy, comparable to the accuracy of existing systems [16,24]. It is believed that the proposed scheme is of relatively high efficiency and easy to implement. It is quite suitable for the measurement of large-scale components, such as car bodies, ship plates and astronautical/aeronautical large-scale thin-wall components. Future work will focus on more intelligent path planning algorithms and the improvement of measuring accuracy.

Acknowledgments: This work was supported by the National Basic Research Program of China (973 Program,

No. 2014CB046604); National Science and Technology Major Project: Numerical Control Machine Tool
and Basic Manufacturing Equipment with High Range (04 Special Program, No. 2014ZX04015021); National

Science and Technology Major Projects-Key and Common Technology in Ship Intelligent Manufacturing

(No. 17GFB-ZB02-194); National Natural Science Foundation of China (No. 51575354); Interdisciplinary Program

of Shanghai Jiao Tong University (No. YG2014MS04, No. YG2015MS09). The authors would like to express their

sincere appreciation to them. Comments from the reviewers and the editors are very much appreciated.

Author Contributions: Hui Du designed the principle of this novel system, built the experiment system

environment, finished the verification experiment/data processing and wrote the paper. Xiaobo Chen contributed

to the access of point cloud, including the design of the fringe projection scanner and camera calibration.

Juntong Xi provided the necessary experiment environment (e.g., robot, laser tracker and so on) and the project

support. Chengyi Yu helps to design the principle. Bao Zhao helps to build the metric tool.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Chen, F.; Brown, G.M. Overview of 3-D shape measurement using optical methods. Opt. Eng. 2000, 39, 10–22.

2. Mendikute, A.; Yagüe-Fabra, J.A.; Zatarain, M.; Bertelsen, Á.; Leizea, I. Self-calibrated in-process

photogrammetry for large raw part measurement and alignment before machining. Sensors 2017, 17, 2066.

[CrossRef] [PubMed]

3. Sun, B.; Zhu, J.; Yang, L.; Yang, S.; Guo, Y. Sensor for in-motion continuous 3D shape measurement based on

dual line-scan cameras. Sensors 2016, 16, 1949. [CrossRef] [PubMed]

4. Jin, Z.; Yu, C.; Li, J.; Ke, Y. Configuration analysis of the ers points in large-volume metrology system. Sensors

2015, 15, 24397–24408. [CrossRef] [PubMed]

5. Hexagon T-Scan. Available online: http://www.hexagonmi.com/products/3d-laser-scanners/leica-tscan-5

(accessed on 7 December 2017).

6. Kosarevsky, S. Practical way to measure large-scale 2D parts using repositioning on coordinate-measuring

machines. Measurement 2010, 43, 837–841. [CrossRef]

7. Feng, C.X.J.; Saal, A.L.; Salsbury, J.G.; Ness, A.R.; Lin, G.C.S. Design and analysis of experiments in cmm

measurement uncertainty study. Precis. Eng. 2007, 31, 94–101. [CrossRef]

8. Saito, K.; Miyoshi, T.; Yoshikawa, H. Noncontact 3-D digitizing and machining system for free-form surfaces.

CIRP Ann. Manuf. Technol. 1991, 40, 483–486. [CrossRef]

9. Gong, C.; Yuan, J.; Ni, J. Nongeometric error identification and compensation for robotic system by inverse

calibration. Int. J. Mach. Tools Manuf. 2000, 40, 2119–2137. [CrossRef]

10. Ye, S.; Wang, Y.; Ren, Y.; Li, D. Robot Calibration Using Iteration and Differential Kinematics. J. Phys. Conf. Ser.

2006, 48, 1–6. [CrossRef]

11. Li, J.; Guo, Y.; Zhu, J.; Lin, X.; Xin, Y.; Duan, K.; Tang, Q. Large depth-of-view portable three-dimensional

laser scanner and its segmental calibration for robot vision. Opt. Lasers Eng. 2007, 45, 1077–1087. [CrossRef]

12. Larsson, S.; Kjellander, J.A.P. An Industrial Robot and a Laser Scanner as a Flexible Solution towards an

Automatic System for Reverse Engineering of Unknown Objects. In Proceedings of the ASME Biennial

Conference on Engineering Systems Design and Analysis, Manchester, UK, 19–22 July 2004; pp. 341–350.

13. Yu, C.; Chen, X.; Xi, J. Modeling and calibration of a novel one-mirror galvanometric laser scanner. Sensors

2017, 17, 164. [CrossRef] [PubMed]

14. Li, J.; Chen, M.; Jin, X.; Chen, Y.; Dai, Z.; Ou, Z.; Tang, Q. Calibration of a multiple axes 3-D laser scanning

system consisting of robot, portable laser scanner and turntable. Opt.-Int. J. Light Electron Opt. 2011,

122, 324–329. [CrossRef]

Sensors 2017, 17, 2886 13 of 13

15. Stenz, U.; Hartmann, J.; Paffenholz, J.A.; Neumann, I. A framework based on reference data with

superordinate accuracy for the quality analysis of terrestrial laser scanning-based multi-sensor-systems.

Sensors 2017, 17, 1886. [CrossRef] [PubMed]

16. Yin, S.; Ren, Y.; Guo, Y.; Zhu, J.; Yang, S.; Ye, S. Development and calibration of an integrated 3D scanning

system for high-accuracy large-scale metrology. Measurement 2014, 54, 65–76. [CrossRef]

17. Kumar, U.P.; Somasundaram, U.; Kothiyal, M.P.; Mohan, N.K. Single frame digital fringe projection

profilometry for 3-d surface shape measurement. Opt.-Int. J. Light Electron Opt. 2013, 124, 166–169. [CrossRef]

18. Bräuer-Burchardt, C.; Breitbarth, A.; Kühmstedt, P.; Notni, G. High-speed three-dimensional measurements

with a fringe projection-based optical sensor. Opt. Eng. 2014, 53, 112213. [CrossRef]

19. Gao, B.Z.; Wang, M.; Peng, X.; Liu, X.; Yin, Y. Fringe projection 3D microscopy with the general imaging

model. Opt. Express 2015, 23, 6846–6857.

20. Zhang, C.; Zhao, H.; Gu, F.; Ma, Y. Phase unwrapping algorithm based on multi-frequency fringe projection

and fringe background for fringe projection profilometry. Meas. Sci. Technol. 2015, 26, 045203. [CrossRef]

21. Zhang, C.; Zhao, H.; Zhang, L.; Wang, X. Full-field phase error detection and compensation method for

digital phase-shifting fringe projection profilometry. Meas. Sci. Technol. 2015, 26, 035201. [CrossRef]

22. Chen, X.; Xi, J.T.; Jiang, T.; Jin, Y. Research and development of an accurate 3D shape measurement system

based on fringe projection: Model analysis and performance evaluation. Precis. Eng. 2008, 32, 215–221.

23. Chen, X.; Xi, J.; Jin, Y. Accuracy improvement for 3D shape measurement system based on gray-code and

phase-shift structured light projection. In Proceedings of the International Symposium on Multispectral

Image Processing and Pattern Recognition, Wuhan, China, 15 November 2007.

24. Paoli, A.; Razionale, A.V. Large yacht hull measurement by integrating optical scanning with mechanical

tracking-based methodologies. Robot. Comput.-Integr. Manuf. 2012, 28, 592–601. [CrossRef]

25. Chen, X.; Xi, J.; Jin, Y.; Sun, J. Accurate calibration for a camera–projector measurement system based on

structured light projection. Opt. Lasers Eng. 2009, 47, 310–319. [CrossRef]

26. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000,

22, 1330–1334. [CrossRef]

27. Rusu, R.B. Semantic 3D object maps for everyday manipulation in human living environments. KI Künstliche Intell.

2010, 24, 345–348. [CrossRef]

28. Mitra, N.J.; Nguyen, A. Estimating surface normals in noisy point cloud data. Int. J. Comput. Geom. Appl.

2004, 14, 0400147. [CrossRef]

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access

article distributed under the terms and conditions of the Creative Commons Attribution

(CC BY) license (http://creativecommons.org/licenses/by/4.0/).

- OIML R 95_Ships’ tankDiunggah olehanafado
- IBRID MX6Diunggah olehCovalciuc Bogdan
- STR 1.5Diunggah olehAmin Syah
- TM2000Diunggah olehMuhammad Afrizal Kautsar
- RP_21Diunggah olehmuomemo
- Roughness ManualDiunggah olehTariq Hasan
- Data Sheet FDMDiunggah olehYanet Perez
- 8 Hints for Better Measurement - AgilentDiunggah olehmichelle_heathkit
- SIWAREX_WP231_Quick_Guide_SIWATOOL_V1_5 (1).pdfDiunggah olehJederVieira
- 22083133DDiunggah olehmakenodima
- Scan Plans PautDiunggah olehreiazh
- 95-8616-8.2_GT3000Diunggah olehsithulibra
- FHWA-NH-RD-12323W.pdfDiunggah olehbakri2016
- RET Antenna Installation Using Kathrein PCA)Diunggah olehEnder Arciniegas
- pxie5644r calibrationDiunggah olehcisco211
- ASTM E-543-09 QADiunggah olehmpus
- Manual of Operations For Drug Testing Laboratories.docxDiunggah olehSarah Jane V. Denia
- 7.6 Control of Monitoring and Measuring DevicesDiunggah olehejub6447
- Manual Conducell 4UxFDiunggah olehcadothanhdo
- 6-Series Service ManualDiunggah olehHector2461
- Batch RecordDiunggah olehNajat Albarakati
- Dualscope-MP0Diunggah olehpablitochoa
- FD910 Density MeterDiunggah olehsaid_rahmansyah4750
- CH4Diunggah olehYahya Aleswed
- Engine Speed&Timing Sensor - Calibrate (RENR9319)Diunggah olehCesar Arturo Pajuelo Espinoza