Overview of three-dimensional shape measurement using optical methods

Frank Chen
Ford Motor Company
Research & Vehicle Technology
M22, Room 2608C, AEC
20000 Rotunda Dr., P.O. Box 2053
Dearborn, Michigan 48121
E-mail: fchen1@ford.com

Gordon M. Brown, FELLOW SPIE
Optical Systems Engineering
1853 Timarron Way
Naples, Florida 34109-3319
E-mail: GMBrownOSE@aol.com

Mumin Song
Ford Motor Company
Ford Research Laboratory
M3135, Room 2162, SRL
20000 Rotunda Dr., P.O. Box 2053
Dearborn, Michigan 48121
E-mail: msong1@ford.com

Abstract. We first provide an overview of 3-D shape measurement using various optical methods. Then we focus on structured light techniques, where various optical configurations, image acquisition techniques, data postprocessing and analysis methods, and advantages and limitations are presented. Several industrial application examples are presented. Important areas requiring further R&D are discussed. Finally, a comprehensive bibliography on 3-D shape measurement is included, although it is not intended to be exhaustive. © 2000 Society of Photo-Optical Instrumentation Engineers. [S0091-3286(00)00101-X]

Subject terms: three-dimensional shape measurement; coordinate measurement; optical methods; overview.

Paper received July 12, 1999; revised manuscript received Aug. 23, 1999; accepted for publication Aug. 23, 1999.
1 Introduction

In industry, there is a need for accurately measuring the 3-D shapes of objects to speed up and ensure product development and manufacturing quality. Applications of 3-D shape measurement include control for intelligent robots, obstacle detection for vehicle guidance, dimension measurement for die development, stamping panel geometry checking, and accurate stress/strain and vibration measurement. Moreover, automatic on-line inspection and recognition issues can be converted to the 3-D shape measurement of an object under inspection, for example, body panel paint defect and dent inspection. Recently, with the evolution in computer technologies, coupled with the development of digital imaging devices, electro-optical components, lasers, and other light sources, 3-D shape measurement has reached the point that some techniques have been successfully commercialized. For a small-scale depth or shape, micrometer or even nanometer measurements can be reached if a confocal microscope or other 3-D microscope is used. However, the key is the relative accuracy, or one part out of the measurement depth. This poses a real challenge for large-scale shape measurement. For example, how accurate can a 0.5 m depth measurement be? Moreover, for a large-scale depth and shape measurement, frequently more cameras and camera positions are required to obtain several shapes from which the final large shape can be patched. This raises the issue of how to patch these shapes together in a highly accurate manner and perform local and global coordinate transforms. This subsequently generates another problem to be solved, namely, to overcome lens distortion and aberrations. After the 3-D shape is obtained, the data must be compared with a computer aided engineering (CAE) model.
This paper provides an overview of 3-D shape measurement using various optical methods. Then it focuses on structured light measurement systems for measuring relatively large scale and 360-deg shape. It then outlines various detailed aspects such as absolute phase measurement, structured light sources, image acquisition sensors, and camera model and calibration, followed by a discussion of global and local coordinate translation methods. Point cloud patching and CAD data comparison are also discussed. Several applications are described. Finally, future research trends such as real time computing, automating and optimizing sensor placement, and the need for a common standard for the evaluation of optical coordinate measurement systems (OCMSs) are presented.
2 Optical 3-D Measurement Techniques

Various optical techniques have recently been developed for measuring 3-D shape from one position. A comprehensive overview of some of the techniques can be found in Ref. 1.
2.1 Time/Light in Flight

The time of flight method for measuring shape is based on the direct measurement of the time of flight of a laser or other light source pulse. During measurement, an object pulse is reflected back to the receiving sensor and a reference pulse is passed through an optical fiber and received by the sensor. The time difference between the two pulses is converted to distance. A typical resolution for the time of flight method is around a millimeter. With subpicosecond pulses from a diode laser and high-resolution electronics, submillimeter resolution is achievable. The recently reported time correlated single photon counting method (Ref. 3) has a depth repeatability better than 30 μm at a standoff distance of 1 m. Another similar technique is called light-in-flight holography, where either short temporal coherence light or a very short light pulse is used to generate a motion image of a propagating optical wavefront. Combined with digital reconstruction and a Littrow setup, the depth resolution may reach 6.5 μm (Ref. 6).
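As a back-of-the-envelope illustration of the principle (not of any particular sensor cited above), the range follows directly from the measured delay between the object and reference pulses; the short sketch below, with illustrative names, shows the conversion and the timing resolution it implies.

```python
# Minimal sketch of time-of-flight ranging (illustrative only):
# the round-trip delay maps to range as z = c * dt / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(delay_s: float) -> float:
    """Convert the measured object/reference pulse delay (s) to range (m)."""
    return C * delay_s / 2.0

# A 1 ps timing resolution corresponds to about 0.15 mm of range, which is
# consistent with the submillimeter figures quoted above.
print(tof_range(1e-12))  # ~1.5e-4 m
```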
2.2 Laser Scanning

Point laser triangulation employs the well-known triangulation relationship in optics. The typical measurement range is ±5 to ±250 mm, the accuracy is about 1 part in 10,000, and the measurement frequency is 40 kHz or higher. A charge coupled device (CCD) or a position sensitive detector (PSD) is widely used to digitize the point laser image. For a PSD, the measurement accuracy is mainly dependent on the accuracy of the image on the PSD. Beam spot reflection and stray light will also affect the measurement accuracy. Idesawa (Ref. 9) developed some methods to improve the accuracy of the PSD by using a high accuracy kaleidoscopic mirror tunnel position sensing technique (KM-PSM) and a hybrid type of position sensitive detector (R-HPSD). CCD based sensors avoid the beam spot reflection and stray light effects and provide more accuracy because of the single pixel resolution. Another factor that affects the measurement accuracy is the difference in the surface characteristics of a measured object from the calibration surface. Usually calibration should be performed on similar surfaces to ensure the measurement accuracy. The recently developed confocal technique can tolerate surface color change, transparency difference, and irregularity without calibration (Ref. 10).
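For orientation only, a minimal sketch of the triangulation relationship for one simple geometry (laser beam parallel to the camera optical axis, baseline b, lens focal length f) shows how a spot-position error on the CCD or PSD maps into a range error; the geometry and names are assumptions for illustration, not a description of any cited sensor.

```python
# Minimal sketch of point laser triangulation for one simple geometry:
# laser beam parallel to the camera optical axis, separated by baseline b,
# so a point at range z images at lateral position x = f*b/z on the sensor.
def range_from_spot(x_m: float, f_m: float, b_m: float) -> float:
    """Range z (m) from the spot image position x (m), focal length f, baseline b."""
    return f_m * b_m / x_m

def range_sensitivity(z_m: float, f_m: float, b_m: float, dx_m: float) -> float:
    """Range error produced by a spot-position error dx: dz = z**2 / (f*b) * dx."""
    return (z_m ** 2) / (f_m * b_m) * dx_m

# Example: f = 25 mm, b = 100 mm, z = 0.5 m, 1 um centroid error -> ~0.1 mm range error
print(range_sensitivity(0.5, 0.025, 0.1, 1e-6))
```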
2.3 Moiré

The moiré method can be divided into shadow moiré and the more practical projection moiré techniques. The key to the moiré technique is two gratings, one a master grating and the other a reference grating, from which contour fringes can be generated and resolved by a CCD camera. Increased resolution is realized since the gratings themselves do not need to be resolved by the CCD camera. However, if the reference grating is computer generated, as in the logical moiré method, the master grating must be resolved by the camera. The penalties for the high resolution are the implementation complexity and the need for a high power light source as compared with a structured light technique. To (1) overcome environmental perturbations, (2) increase image acquisition speed, and (3) utilize phase shift methods to analyze the fringe pattern, snapshot or multiple image moiré systems have been developed. Two or more moiré fringe patterns with different phase shifts are simultaneously acquired using multicamera or image-splitting methods. Reference 20 provides a comparison of some high speed moiré contouring methods with particular stress on sources of noise and system error functions. The typical measurement range of the phase shifting moiré method is from 1 mm to 0.5 m with a resolution of 1/10 to 1/100 of a fringe. Some novel applications and related references can be found in Refs. 21-30.
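As a rough illustration of how the grating pitch and the illumination/viewing geometry set the depth represented by one fringe, the sketch below uses the common shadow moiré approximation dz = p/(tan θi + tan θv); the formula and the numbers are generic assumptions for illustration, not parameters of the cited systems.

```python
import math

# Hedged sketch: for shadow moire with grating pitch p, illumination angle
# theta_i and viewing angle theta_v (measured from the surface normal), one
# fringe corresponds approximately to a depth change of
#   dz = p / (tan(theta_i) + tan(theta_v)).
def contour_interval(pitch_m: float, theta_i_deg: float, theta_v_deg: float) -> float:
    ti = math.radians(theta_i_deg)
    tv = math.radians(theta_v_deg)
    return pitch_m / (math.tan(ti) + math.tan(tv))

# Example: 0.1 mm pitch, 45 deg illumination, normal viewing -> 0.1 mm per fringe;
# a 1/10 to 1/100 fringe resolution then corresponds to 10 um down to 1 um.
print(contour_interval(1e-4, 45.0, 0.0))
```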
2.4 Laser Speckle Pattern Sectioning

The 3-D Fourier transform relationship between optical wavelength (frequency) space and the distance (range) space is used to measure the shape of an object. Laser radar 3-D imaging, also known as speckle pattern sampling, is achieved by utilizing the principle that the optical field in the detection plane corresponds to a 2-D slice of the object's 3-D Fourier transform. The other 2-D slices of the object's 3-D transform are acquired by changing the wavelength of the laser. A speckle pattern is measured using a CCD array at each laser wavelength, and the individual frames are stacked to generate a 3-D data array. A 3-D Fourier transform is applied to this data array to obtain the 3-D shape of the object. When a reference plane method is used, this technique is similar to two-wavelength or multiwavelength speckle interferometry. The measurement range can be from a micrometer to a few meters. The accuracy depends on the measurement range. With current laser technology, 1- to 10-μm resolution is attained in a measurement range of 10 mm, and 0.5-μm measurement uncertainty is achievable (see the HoloMapper in the commercial system list presented in Table 1). The advantages of this technique are (1) the high flexibility of the measurement range and (2) that phase shifting, as in conventional interferometry, may not be required. The limitation of this technique is that, for relatively large scale shape measurement, it takes more time to acquire the images at the different wavelengths.
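The sketch below illustrates only the reference-plane (wavelength scanning) variant mentioned above, in a highly simplified form: at each pixel the intensity beats as the optical frequency is stepped, so a Fourier transform along the frequency axis yields depth. The frame stack, the step size, and all names are assumptions for illustration, not the processing of any cited system.

```python
import numpy as np

# Simplified reference-plane sketch: intensity at a pixel varies with optical
# frequency nu as cos(4*pi*nu*z/c) for a path difference of 2z, so an FFT
# along the frequency axis peaks at an index proportional to depth z.
C = 3.0e8                     # m/s
N_STEPS, DNU = 64, 50e9       # 64 frequency steps of 50 GHz (illustrative)

def depth_map(frames: np.ndarray) -> np.ndarray:
    """frames: (N_STEPS, H, W) speckle intensities, one frame per laser frequency."""
    spectrum = np.abs(np.fft.rfft(frames - frames.mean(axis=0), axis=0))
    peak = spectrum.argmax(axis=0)            # beat-frequency index per pixel
    return peak * C / (2 * N_STEPS * DNU)     # depth; range bin ~ c/(2*N*dnu) ~ 47 um

# Toy example: flat object 1 mm from the reference plane
z_true = 1e-3
nu = np.arange(N_STEPS) * DNU
frames = 1 + np.cos(4 * np.pi * nu * z_true / C)[:, None, None] * np.ones((1, 4, 4))
print(depth_map(frames).mean())   # ~9.8e-4 m (quantized by the range bin)
```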
2.5 Interferometry

The idea behind interferometric shape measurement is that fringes are formed by variation of the sensitivity matrix that relates the geometric shape of an object to the measured optical phases. The matrix contains three variables, wavelength, refractive index, and illumination and observation directions, from which three methods, two or multiple wavelength, refractive index change, and illumination direction variation (two sources), are derived. The resolution of the two-wavelength method depends on the equivalent wavelength (Λ) and a phase resolution of Λ/200. For example, two lines of an argon laser (0.5145 and 0.4880 μm) will generate an equivalent wavelength of 9.4746 μm and a resolution of 0.047 μm.
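The two-wavelength figures quoted above follow directly from the equivalent (synthetic) wavelength, as the short calculation below reproduces.

```python
# The two-wavelength sensitivity comes from the equivalent (synthetic)
# wavelength Lambda = lam1 * lam2 / |lam1 - lam2|.
def equivalent_wavelength(lam1_um: float, lam2_um: float) -> float:
    return lam1_um * lam2_um / abs(lam1_um - lam2_um)

lam_eq = equivalent_wavelength(0.5145, 0.4880)   # the two argon laser lines
print(lam_eq)          # ~9.4746 um, as quoted in the text
print(lam_eq / 200.0)  # ~0.047 um depth resolution at a phase resolution of Lambda/200
```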
Another range measurement technique with high accuracy is double heterodyne interferometry using a frequency shift. Recent research shows it achieves a remarkable 0.1 mm resolution over a 100 m range. Interferometric methods have the advantage of being monostatic, without the shading problem of triangulation techniques. Combined with phase shifting analysis, interferometric methods and heterodyne techniques can have accuracies of 1/100 and 1/1000 of a fringe, respectively. With a dedicated optical configuration design, the accuracy can reach 1/10,000 of a fringe. Other methods such as shearography, diffraction grating, digital wavefront reconstruction and wavelength scanning, and conoscopic holography are also under development. Both shearography and conoscopic holography can be common path, collinear systems that are relatively immune to mechanical disturbances.
Table 1 Some full field shape measurement commercial systems based on leading edge technologies.

Atos System (Capture 3D, Costa Mesa, CA 92626, 1-714-546-7278). Principle: structured light + photogrammetry; 360-deg view/patching. Accuracy: about 50 μm (2σ) on a relatively large volume.

Comet/OptoTrak System (4000 Grand River Avenue, Suite 101, Novi, MI 48375; mikeb@steinbichler.com). Principle: structured light + optical tracking; 360-deg view/patching. Accuracy: about 50 μm (2σ) on a relatively large volume.

Optigo/CogniTens System (U.S. 815-697-1026). Principle: random dot pattern + photogrammetry + trilinear tensor; 360-deg view/patching. Accuracy: about 20 to 100 μm (2σ) on a relatively large volume.

4DI System (149 Sidney Street, Cambridge, MA 02139, 617-354-3690). Principle: structured light + real time computing; one view/no patching. Accuracy: about 10⁻⁴ on a medium volume.

HoloMapper System (ERIM International, Inc., 1975 Green Road, Ann Arbor, MI 48105, 318-904-0287). Principle: laser radar/multiple wavelength; one view/no patching. Accuracy: uncertainty 0.5 μm on a medium volume.
2.6 Photogrammetry

Typical photogrammetry employs the stereo technique to measure 3-D shape, although other methods such as defocus, shading, and scaling can also be used. Photogrammetry is mainly used for feature type 3-D dimension measurement. It usually requires some bright markers, such as retroreflective painted dots, on the surface of the measured object. In general, photogrammetric 3-D reconstruction is based on the bundle adjustment principle, in which the geometric model of the central perspective and the orientation of the bundles of light rays in a photogrammetric relationship are developed analytically and implemented by a least squares procedure. Extensive research has been done to improve the accuracy of photogrammetry. Recent advances achieve accuracies as high as one part in 100,000 or even one part in 1,000,000.
2.7 Laser Tracking System

A laser tracker uses an interferometer to measure distances and two high accuracy angle encoders to determine vertical and horizontal angles. The laser tracker SMART 310, developed at the National Bureau of Standards, was improved at API (Automated Precision Inc.) to deliver 1-μm range resolution and 0.7-arcsec angular resolution. The laser tracker is a scanning system and is usually used to track the positions of optical sensors or robots. The Leica LTD 500 system can provide an absolute distance measurement with an accuracy of about ±50 μm and angle encoders that permit an accuracy of 5 parts in a million within a 35-m radius measurement volume.
2.8 Structured Light

The structured light method, also categorized as active triangulation, includes both projected coded light and sinusoidal fringe techniques. Depth information of the object is encoded into a deformed fringe pattern recorded by an image acquisition sensor. Although related to projection moiré techniques, shape is directly decoded from the deformed fringes recorded from the surface of a diffuse object instead of using a reference grating to create moiré fringes. Two other related techniques use projected random patterns and a trilinear tensor. When an LCD/digital mirror device (DMD) based and optimized shape measurement system is used, a measurement accuracy of one part in 20,000 may be achieved (Ref. 79). The structured light method has the following merits: (1) easy implementation; (2) phase shifting, fringe density, and direction changes can be realized with no moving parts if a computer controlled LCD/DMD is used; and (3) fast full field measurement. Because of these advantages, the coordinate measurement and machine vision industries have started to commercialize the structured light method (see Table 1), and some encouraging applications can be found in Refs. 80-82. However, to make this method even more accepted in industry, some issues have to be addressed, including the shading problem, which is inherent to all triangulation techniques. The 360-deg multiple view data registration and defocus with projected gratings or dots show promise for a solution. The following sections touch on these areas. For small objects under a microscope, lateral and depth resolutions of 1 and 0.1 μm, respectively, can be achieved. The use of a confocal microscope for shape measurement can be found in Ref. 88.
3 General Approach to Measure 360-deg Shape of an Object

The global coordinate system is set up and local coordinate systems are registered during measurement. A structured light imaging system is placed at an appropriate position to measure the 3-D shape from one view, and the absolute phase value at each object point is calculated. These phase values and a geometric-optic model of the measurement system determine the local 3-D coordinates of the object points. Three ways are usually used to measure 360-deg shape: the object rotation method, the camera/imaging system transport technique, and the fixed imaging system with multiple cameras approach. For camera transport, which is usually used for measuring large objects, the measurement is repeated at different views to cover the measured object. All the local 3-D coordinates are transformed into the global coordinate system and patched together using a least squares fit method. The measured final 3-D coordinates of the object can be compared with CAD master data in a computer using various methods, of which the differentiation comparison technique and the least squares fit are often used.
4 Global and Local Coordinates Translation

For a 360-deg 3-D measurement of an object, an optical sensor must be positioned at different locations around the object. The point clouds obtained at each position must be transformed into global coordinates from each local coordinate system so that these point clouds can be patched together to generate the final data set. To accomplish this, each sensor coordinate system location and orientation must be known or measured. Any error in measuring and calculating the sensor location and orientation will cause a propagation error in the global coordinates, which will prevent a high overall accuracy of the final measurement. There are several approaches to determine the relationship between the global and local coordinate systems: (1) accurate mechanical location and orientation of the sensor (local coordinate system), (2) optical tracking of the location and orientation of the sensor using active or passive targets attached to the sensor, and (3) photogrammetry of markers accurately fixed in the object field, plus hybrid methods. Figure 1 is a diagram showing these approaches.

Fig. 1 Sensor planning diagram showing several approaches to determine the relationship between global and local coordinate systems.
For the mechanical approach (1), the sensor is attached to a mechanical positioning system of high accuracy. The location and orientation of the sensor are derived from the system coordinate and angle information. The advantage of the mechanical system is that it is robust and has high accuracy. However, the cost for accuracy in mechanical devices, overcoming environmental perturbation, and maintenance of the equipment is very high.

For the optical approach (2), the local coordinate system is calculated from the measured global positions of reference targets (active or passive, with known local coordinates) on the frame of the optical sensor using an optical tracker system. The advantage is portability and compactness. However, the sensor targets must be visible, and this limits the flexibility. Moreover, the floor vibration effect must be considered. If a high accuracy tracking system is used, such as a laser tracking system, the cost is also relatively high. Both mechanical and optical methods are prone to angular error.

The photogrammetry approach (3) can provide high accuracy local coordinate system location and orientation from measurement of the global coordinates of markers accurately fixed in the object field. The accuracy can be as high as one part in a million. Conservatively, the accuracy can be one part in 100,000. However, the key limitation of this method is that registration markers must be placed on or around the object. This increases the measurement time and automation complexity.
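As a minimal sketch of what the marker-based approach computes, the following estimates a least squares rigid transform from markers measured in both the local (sensor) and global frames and then maps the whole local point cloud into global coordinates. The SVD-based solution and the function names are standard illustrative choices, not the specific procedures of the cited references.

```python
import numpy as np

# Hedged sketch: markers measured in both the sensor's local frame and the
# global frame give a least squares rigid transform (R, t) with
# global ~= R @ local + t, which is then applied to the whole local cloud.
def fit_rigid_transform(local_pts: np.ndarray, global_pts: np.ndarray):
    """Least squares R, t from matched Nx3 marker coordinates."""
    cl, cg = local_pts.mean(axis=0), global_pts.mean(axis=0)
    H = (local_pts - cl).T @ (global_pts - cg)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                         # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cg - R @ cl
    return R, t

def to_global(cloud_local: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Transform an Nx3 local point cloud into the global coordinate system."""
    return cloud_local @ R.T + t
```

Any residual error in the fitted (R, t) propagates directly into the patched global point cloud, which is why marker measurement accuracy dominates this approach.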
5 Structured Light Sources, Image Sensors, Camera Model and Calibration

The light source is important for the overall accuracy of a 3-D shape measurement system. Important parameters include uniformity, weight, intensity profile, and speckle/dot size. The projection of a Ronchi grating slide provides high resolution with bright images and is currently used in some commercial systems. However, to calculate absolute distance, multiple grating slides are needed to apply the phase shift method and to vary the grating frequency. This in turn results in slow speed and a relatively large space for storing the different gratings. Around 1991 to 1993, liquid crystal projectors (LCDs) using incoherent light came into use, in which each pixel can be addressed by a computer image generating system. The advantage of this type of projection is the high speed for phase shifting and variable grating frequency. The disadvantage is that LCDs require powerful light sources, resulting in cooling concerns and increased weight. Moreover, the resolution is low compared with film slide based light projection. To overcome the brightness concern of the LCD, the reflective LCD, the gas plasma display (GPD), and the DMD have been developed. In addition, the gaps between DMD mirrors are smaller than those between LCD pixels, so DMD images are relatively sharper. A detailed error analysis and optimization of a shape measurement system using an LCD/DMD type fringe projector can be found in Ref. 106. The LCD, GPD, and DMD have RGB color, by which simultaneous acquisition of three images or three phase shifted images can be used, and this makes the phase shift technique immune to environmental perturbation. The color advantage may also be used for absolute phase determination. Other light sources are the two point source laser interferometer using a Mach-Zehnder configuration, fiber optics, a birefringent crystal, an acousto-optic modulator (AOM), and Lasiris's non-Gaussian structured light projector using a specially designed prism that can generate 99 lines with an interbeam angle of 0.149 deg.
In optical 3-D shape measurement, image acquisition is a key factor for accuracy. Currently, images are acquired using a CCD or a charge injection device (CID) sensor. There are full frame, frame transfer, and interline transfer sensors. The major concerns regarding these sensors are speed, resolution, dynamic range, and accuracy. Up to 5k×5k pixel CCD sensors are commercially available, such as the DALSA TA-D9-5120, Ford Aerospace 4k×4k, Kodak Model 16.8i (4k×4k), and Loral CCD481, to name a few. Usually the high resolution sensor is a full frame CCD that does not have storage and requires a shutter to enable image transfer, which results in relatively slow speed. With combined micro- and macroscanning techniques, the image resolution can be as high as 20k×20k, which is equivalent to the resolution of a 20×20 cm area photograph with 100 lines/mm. The CID sensor differs from the CCD sensor in that it does not bloom through overexposure, and it can be read out selectively since each pixel is individually addressed.

A high accuracy CCD sensor or video camera requires high radiometric and geometric accuracy, including both intrinsic parameters such as lens distortion and extrinsic parameters such as the coordinate location and orientation of the camera. A detailed discussion regarding the characterization and calibration of the radiometric and geometric features of a CCD sensor can be found in Ref. 118. A relative accuracy of 1 part in 1000 can be achieved using on site automatic calibration during measurement. More accurate calibration, such as 10⁻⁴ to 10⁻⁵ accuracy, may be achieved using a formal off line calibration procedure with a more complex and nonlinear camera model. A high accuracy camera model is necessary to characterize the lens aberration and to correct the captured image for the distortions caused by the aberration. The calibration of an optical measurement system can be further divided into a geometric parameter technique, as described earlier, and a geometric transformation approach. The geometric parameter technique requires the known parameters of the optical setup, including the projector and image sensor. On the other hand, the geometric transformation approach does not require knowledge of the parameters of the imaging system; Ref. 131 presents the recently developed projection or image ray tracing technique and Ref. 132 provides the known position of the object or camera variation approach. Once the imaging system is moved or the measured object size/depth is changed, the calibration procedure may need to be performed again. This, however, may pose some limitation for this method. Reference 134 developed a self-calibration approach that may reduce the complexity of the calibration procedure and increase accuracy.
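To make the intrinsic-parameter part concrete, the sketch below applies one commonly used lens distortion model (two radial and two tangential coefficients) to normalized image coordinates; the cited calibration methods may use different or richer camera models, so this is illustrative only.

```python
import numpy as np

# Common radial (k1, k2) plus tangential (p1, p2) distortion model, shown
# only to illustrate what "correcting the captured image for lens aberration"
# involves; it is not necessarily the model used in the cited references.
def distort(xn: np.ndarray, yn: np.ndarray, k1: float, k2: float, p1: float, p2: float):
    """Map ideal normalized image coordinates (xn, yn) to distorted ones."""
    r2 = xn**2 + yn**2
    radial = 1.0 + k1 * r2 + k2 * r2**2
    xd = xn * radial + 2 * p1 * xn * yn + p2 * (r2 + 2 * xn**2)
    yd = yn * radial + p1 * (r2 + 2 * yn**2) + 2 * p2 * xn * yn
    return xd, yd
```

Calibration estimates the coefficients from images of a known target; correcting an image then amounts to inverting this mapping (typically by iteration) before triangulation.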
6 Absolute Phase Value Measurement and Overcoming Discontinuity Ambiguity

In general, using phase shifted structured light to measure the 3-D shape of an object yields only relative phase values. Phase shifting determines the fractional order of the fringes at any pixel. These fractional orders are connected together using their adjacent integer orders, which is the so-called unwrapping process. However, when the phase difference between adjacent pixels is larger than 2π, as occurs at a discontinuity or a steep change of shape, the integer fringe order becomes ambiguous. Recently, several methods have been developed to overcome discontinuities. The basic idea is that changing the measurement system's sensitivity results in fringe or projected structured strip density changes. This means that the integer orders of the fringes sweeping through the discontinuity can be viewed in both the spatial and temporal domains, which results in various different methods. As mentioned in Ref. 144, the key to overcoming discontinuity is to determine the integer fringe order during the unwrapping process. These methods, such as two wavelength or parameter change, are used in interferometry to determine absolute fringe fractional and integer orders. In-plane rotation of the grating and varying the grating frequency (e.g., fringe projection with two point variable spacing) are useful techniques. Triangulation and stereography can also be employed to determine absolute phase values and overcome discontinuity, although there are limitations since all pixel points may not be covered without changing the viewing direction. Some direct phase calculation methods, such as phase derivative methods without phase unwrapping, may still need a continuity condition. However, the same problem can also be solved by the phase locked loop technique.
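A minimal sketch of one such scheme is given below, assuming a four-step phase shift for the fractional order plus a coarse pattern with a single fringe across the field (hence unambiguous) used to fix the integer order. The variable names and this particular two-frequency strategy are illustrative assumptions, not a description of any specific cited method.

```python
import numpy as np

# Hedged sketch: four-step phase shifting gives the wrapped (fractional) phase,
# and a coarse, single-fringe pattern resolves the integer fringe order.
def wrapped_phase(i1, i2, i3, i4):
    """Four frames with 0, pi/2, pi, 3*pi/2 shifts -> wrapped phase in (-pi, pi]."""
    return np.arctan2(i4 - i2, i1 - i3)

def absolute_phase(phi_dense, phi_coarse, fringe_ratio):
    """Resolve the 2*pi ambiguity of the dense pattern.

    fringe_ratio: number of dense fringes per coarse fringe; phi_coarse is
    unambiguous because the coarse pattern has only one fringe over the field.
    """
    order = np.round((fringe_ratio * phi_coarse - phi_dense) / (2 * np.pi))
    return phi_dense + 2 * np.pi * order
```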
7 Image Data Patching and CAD Data Comparison

After processing the 360-deg local images, the local point cloud patches must be merged together to obtain a final global point cloud of the object. The accuracy of a measurement system is also determined by the matching accuracy. There is extensive research on matching methods and algorithms in photogrammetry, which can generally be categorized as area based matching, feature based matching, and other methods such as the centroid method. Area based matching takes advantage of correlation coefficient maximization and least squares minimization, while feature based matching exploits algorithms extracting features such as points, lines, and areas. Area based matching usually employs pixel intensity as a constraint, while feature based matching uses a geometric constraint. All of these methods require subpixel accuracy to achieve overall accuracy. Under optimized conditions, 0.02 pixel accuracy can be achieved, and in general 0.05 pixel accuracy should be obtained. There is a discussion of subpixel accuracy versus geometric accuracy in which geometric accuracy is more promising.
For CAD data comparison, the differentiation and least mean squares methods are mainly used. The measured point cloud data are subtracted from the CAD data to obtain differences as an error indicator. The comparison to the appropriate CAD model can be used to obtain the best fit of registration between the two. Model matching can start with the selection of a subset of the point cloud data. The measured points in this subset are matched to the CAD data by making their normal vectors collinear with the normal vectors of the local CAD surface. The distances in the normal direction between the CAD surface and the measured point cloud are fed into a least squares error function. The best fit is achieved by optimizing this least squares error function. Before the measured data can be compared to the CAD master data, they must be converted into standard CAD representations. This is usually done by first splitting the measured data into major geometric entities and modeling these entities in nonuniform rational B-spline surface form; this has the advantage of describing quadric as well as free form surfaces using a common mathematical formulation.
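The comparison step described above can be sketched as follows, assuming the CAD surface is available as sampled points with normals and that a best-fit registration has already been applied; the brute-force nearest-neighbour search and the names are illustrative simplifications, not the algorithms of the cited work.

```python
import numpy as np

# Hedged sketch of the comparison step: each measured point is compared with
# the CAD surface along the local surface normal, and the signed normal
# distances serve both as the error indicator and as the least squares
# objective that the best-fit registration minimizes.
def normal_deviations(measured: np.ndarray, cad_pts: np.ndarray, cad_normals: np.ndarray):
    """Signed normal distance of each measured point to the CAD samples (Nx3 arrays)."""
    d2 = ((measured[:, None, :] - cad_pts[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)                      # nearest CAD sample per point
    diff = measured - cad_pts[nearest]
    return (diff * cad_normals[nearest]).sum(axis=1)  # projection on the unit normal

def rms_error(measured, cad_pts, cad_normals):
    """Least squares objective used for the best fit."""
    dev = normal_deviations(measured, cad_pts, cad_normals)
    return np.sqrt((dev ** 2).mean())
```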
8 Sensor Planning

Large-scale surface inspection often requires either multiple stationary sensors or relocation of a single sensor to obtain complete 3-D information. A multisensor system has an advantage in high-volume inspection of similar products, but usually lacks flexibility. To improve the flexibility, various portable sensor systems and automated eye-on-hand systems have been produced. However, no matter what kind of sensor system is used, the first and most critical problem to be solved is how the sensor(s) can be placed to successfully view the 3-D object without missing required information. Given information about the environment (e.g., the observed objects and the available sensors) and information about the mission (i.e., detection of certain object features, object recognition, scene reconstruction, object manipulation, and accurate, sufficiently dense point clouds), strategies should be developed to determine sensor parameters that achieve the mission with a certain degree of satisfaction. Solving such a problem in general is categorized as a sensor planning problem. Considerable effort on general techniques has been made in sensor planning. We can collect them into the following four categories.
8.1 Generate and Test

In this method, sensor configurations are generated first and then evaluated using performance functions and mission constraints. To avoid an exhaustive search process, the domain of sensor configurations is discretized by tessellating a viewing sphere surrounding the object under observation. This is a time-consuming technique that does not guarantee an optimal result. A toy sketch of the idea is given below.
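The sketch assumes the mission constraint is simply the visibility of required surface points within an incidence-angle limit; real planners evaluate far richer performance functions, so the candidate generation and scoring here are purely illustrative.

```python
import numpy as np

# Toy generate-and-test planner: candidate viewpoints are generated on a
# sphere around the object and each is scored by how many required surface
# points face the sensor within an incidence-angle limit (illustrative only).
def sphere_viewpoints(radius: float, n_az: int = 12, n_el: int = 5) -> np.ndarray:
    az = np.linspace(0, 2 * np.pi, n_az, endpoint=False)
    el = np.linspace(-np.pi / 3, np.pi / 3, n_el)
    a, e = np.meshgrid(az, el)
    return radius * np.stack([np.cos(e) * np.cos(a),
                              np.cos(e) * np.sin(a),
                              np.sin(e)], axis=-1).reshape(-1, 3)

def score(view: np.ndarray, points: np.ndarray, normals: np.ndarray,
          max_incidence_deg: float = 60.0) -> int:
    """Count points whose unit normal faces the sensor within the incidence limit."""
    to_sensor = view - points
    to_sensor /= np.linalg.norm(to_sensor, axis=1, keepdims=True)
    cosang = (to_sensor * normals).sum(axis=1)
    return int((cosang > np.cos(np.radians(max_incidence_deg))).sum())

def best_view(points: np.ndarray, normals: np.ndarray, radius: float = 1.0) -> np.ndarray:
    views = sphere_viewpoints(radius)
    return views[np.argmax([score(v, points, normals) for v in views])]
```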
8.2 Synthesis Approach

This approach is built on an analytical relation between mission constraints and sensor parameters. It has a beautiful and promising theoretical framework that can determine the sensor configurations for certain cases. The drawback of this approach is that the analytical relations are sometimes missing, especially when the constraints are complex.
Fig. 2 Measured point cloud data of a component.
8.3 Sensor Simulation System

This approach brings objects, sensors, and light sources into a unified virtual environment. It then uses the generate-and-test approach to find the desired sensor configurations. The simulation systems are useful in the sense that operators can actively evolve the process and ensure the results.
8.4 Expert System Approach

Rule-based expert systems are utilized to bridge reality and expert knowledge of viewing and illumination. The recommended sensor configurations are the output of the expert system from reality checking. In general, the more complete the knowledge we have, the wiser the advice we can get.
8.5 Sensor Planning Examples

These sensor planning techniques have a strong application background. Their goal is aggressively set to improve machine intelligence and reduce the human-intensive operations that cause long development cycle times, high cost, and complexity in modern industry. Several examples are: (1) an intelligent sensor planning system was conceptually defined for the application of automated dimensional measurement using CAD models of the measured parts; (2) an inspection system was able to determine appropriate and flexible actions in new situations because on-line sensor planning techniques were adopted; and (3) the techniques were applied to a robot vision system so that the orientation and position of vision sensors and a light source can be automatically determined.
9 Demonstration Examples

9.1 Panel Spring Back Investigation

During the component development cycle, there is a try-out phase that requires measurement of the shape of a component or of the die used to make the component. The following is a demonstration of using a 3-D optical measurement setup to measure a component and compare it to the master CAD data to evaluate the spring back effect. Figure 2 shows the point cloud data of a component. Figure 3 shows the corresponding CAD data. Figure 4 shows the comparison between the measured data and the CAD data, in which the two sets of data are compared using a least squares fit.
Fig. 3 Corresponding CAD data of a component.
9.2 Vehicle Shape Measurement

For rapid prototyping or benchmarking, the vehicle body shape must often be measured. The following example uses the structured light method combined with photogrammetry to measure a car body shape. Some coded targets were placed on the vehicle body to enable the local to global coordinate transformation. Then structured light was projected onto the vehicle surface, combined with phase shifting and an absolute phase measurement technique using fringe frequency change, to determine the local coordinates pixel by pixel for one view direction. Two hundred and forty view directions were used to cover the whole vehicle surface (it is a real vehicle). The 240 point clouds were then patched together using a least mean squares method. The point cloud data were extracted using 1 out of 8 pixels. The shaded measured data are shown in Fig. 5 and the point cloud data are shown in Fig. 6.
Fig. 4 Comparison of the measured data and CAD data of a component.

Fig. 5 Shaded measured data of a vehicle.

9.3 Vibration 3D

For accurate analysis of vibration or strain, the geometric information of the tested structure must be known. Using the two-wavelength shape measurement method, the vibration amplitude, phase, and geometric information of a tested structure can be measured using a single compact electronic speckle pattern interferometry (ESPI) setup. Figure 7 shows the four vibration states of a corrugated plate clamped along its boundaries and subjected to harmonic excitation at 550 Hz. State 1 in Fig. 7(a) depicts the original corrugated plate geometric shape, and the rest are vibrating states at distinct times. From Fig. 7 one can clearly see the shape effect on vibration.
9.4 Paint Defects

The geometry measurement technique can also be applied to measure paint defects on a body panel of a vehicle, although it is a challenge to detect small flaws in large areas. A methodology has been developed, as shown in the flow chart in Fig. 8 (and in the optical setup in Fig. 9), in which structured light generated from a monitor is reflected from the tested panel and digitized into a computer image processing system. Then the digital Fourier transform method is used to extract the global shape of the panel by selecting the structured light frequency. The defect geometry coupled with the global shape of the panel is calculated by selecting half of the spatial frequencies. The defect geometry is finally obtained by subtracting the two preceding results, as shown in Fig. 10, where Fig. 10(a) shows the panel with projected structured light and Fig. 10(b) shows the final measurement result. One can see that without this calculation, one can only observe the large defects by enhanced fringe modulation. The measurement area is about 0.25×0.25 m and the minimum detectable defect size is about 500 μm. Some other application examples can be found, e.g., in Refs. 205 and 206.
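A rough sketch of the Fourier-domain separation described above is given below, assuming a fringe image whose carrier lies along the image columns: a narrow band around the carrier recovers the slowly varying global panel shape, a wider (half spectrum) band keeps the defect detail as well, and the two phase maps are subtracted. The filter widths and names are illustrative placeholders, not the parameters of the actual system.

```python
import numpy as np

# Hedged sketch of the FFT-based defect extraction: filter the fringe spectrum
# twice (narrow band vs. half spectrum around the carrier), take the phase of
# each inverse transform, and subtract to isolate the defect geometry.
def band_phase(img: np.ndarray, carrier_col: int, half_width: int) -> np.ndarray:
    """Phase of the fringe image keeping spectrum columns near the carrier."""
    spec = np.fft.fftshift(np.fft.fft2(img))
    mask = np.zeros_like(spec)
    c0 = img.shape[1] // 2 + carrier_col
    mask[:, c0 - half_width:c0 + half_width + 1] = 1.0
    return np.angle(np.fft.ifft2(np.fft.ifftshift(spec * mask)))

def defect_phase(img: np.ndarray, carrier_col: int, narrow: int = 2, wide: int = None):
    wide = wide if wide is not None else img.shape[1] // 4   # crude "half spectrum"
    global_shape = band_phase(img, carrier_col, narrow)       # smooth panel shape
    shape_plus_defect = band_phase(img, carrier_col, wide)    # shape + defect detail
    return np.angle(np.exp(1j * (shape_plus_defect - global_shape)))  # wrapped difference
```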
Fig. 6 Measured point cloud data of a vehicle.

Fig. 7 (a) Vibration state 1 depicts the original corrugated plate geometric shape, and (b)-(d) vibration states 2, 3, and 4 show the effect of the underlying shape.
10 Conclusion and Future Research Trends

Although the principles of triangulation, structured light, and interferometry have been in existence for decades, it is only with the recent availability of advanced and low cost computers, electro-optical elements, and lasers that such techniques have reached the breakthrough point of being commercialized and increasingly applied in industry. To make them even more acceptable in industry and to strive to achieve 10⁻⁴ to 10⁻⁵ accuracy, some challenges remain to be addressed. The following may suggest some future trends.
10.1 Real Time Computing

Real time 3-D shape measurement is an ongoing request from industry to drive down product cost and increase productivity and quality. The major impact will be in digital design, digital and physical manufacturing, and fast prototyping that streamline and integrate product design and manufacture. Real time 3-D shape measurement is the key to successfully implementing 3-D coordinate display and measurement, manufacturing control, and on line quality inspection. An encouraging example of this is jigless assembly based on the real time, simultaneous measurement of different but related components. As described by Hobrough, "real time" means assigning a Z value or depth to every pixel within a 17-ms cycle, which corresponds to the integration time of a CCD sensor. Recently, over 100 measured points every 40 ms was achieved using photogrammetry, and there is a report on a real time 3-D shape measurement system. The key for real time is a high computational speed that can meet on line manufacturing needs.

Fig. 8 Diagram showing structured light generated from a monitor, reflected from the test panel, and digitized into a computer image processing system. (Flow chart: digitize; Fourier transform; select the first order and the half spectrum; inverse Fourier transform; subtract the two phases to obtain the result.)

Fig. 9 Optical setup showing structured light generated from a monitor, reflected from the panel, and digitized into a computer image processing system.
10.2 Direct Shape Measurement from a Specular Surface without Paint

There is an urgent need but little research activity in the area of using optical techniques to measure the 3-D shape of an object with a specular surface, such as a die surface. There have been some efforts to develop techniques in this area. References 210 and 211 proposed a technique using four lights to measure specular features. Reference 212 employed a simplified version of the Torrance-Sparrow reflectance model to retrieve object curvature; this method relied on prior knowledge of the surface reflectance. References 213 and 214 suggested using multiple (127) point sources to detect specular shape. Reference 215 developed a photometric sampling technique employing a matrix of extended sources to determine the specular shape. References 216-220 used a diffusive TV screen as the structured light source; however, since the diffusive screen has to be placed close to the measured surface, the illuminated area is limited. References 218-220 proposed a retroreflective screen coupled with projection of structured light, by which a large area may be visualized.

Fig. 10 (a) Panel with projected structured light and (b) defects in the final measurement result.
Table 2 Terminology.

Accuracy: The closeness of the agreement between the result of a measurement and the value of the quantity subject to measurement, that is, the measurand (Ref. 223).

Uncertainty: The estimated possible deviation of the result of a measurement from its actual value (Ref. 224).

Error: The result of a measurement minus the value of the measurand (Ref. 223).

Precision: The closeness of agreement between independent test results obtained under stipulated conditions (Ref. 223).

Repeatability: The closeness of the agreement between the results of successive measurements of the same measurand carried out under the same conditions of measurement (Ref. 223).

Reproducibility: The closeness of the agreement between the results of measurements of the same measurand carried out under changed conditions of measurement (Ref. 223).

Resolution: A measure of the smallest portion of the signal that can be observed (Ref. 224).

Sensitivity: The smallest detectable change in a measurement. The ultimate sensitivity of a measuring instrument depends both on its resolution and the lowest measurement range (Ref. 224).
However, the retroreflective screen must be rigid, and it must be controlled or calibrated so that it does not deform or move during the measurement. References 221 and 222 developed a coaxial linear distance sensor for measuring the specular shape of a die surface; however, it is based on point scanning, which is not fast enough for the industrial application of measuring large die surfaces. The current technique for measuring die surfaces requires painting the surface with powder, which slows measurement speed and reduces measurement accuracy.
10.3 Shading Problem

There is a lack of research activity on overcoming the shading problem inherent in triangulation methods, although some other methods besides the interferometric and laser radar techniques show some progress. These methods use a defocus technique similar to the confocal microscope principle and the newly developed diffraction grating technique.
10.4 Standard Methodology to Evaluate Optical Shape Measurement Systems

An international standard must be established to evaluate optical shape measurement systems. Important parts of this standard should include (1) standard sample parts with known dimensions, surface finishes, and materials; (2) math assumptions and error characterization; (3) measurement speed and volume capability; (4) repeatability and reproducibility procedures; (5) calibration procedures; and (6) reliability evaluation.

A standard specification and terminology is needed to define precision/repeatability and accuracy (see Table 2). Precision and accuracy are often confused with each other. Another question is what should be used: accuracy, error, or uncertainty. Accuracy and uncertainty are more accurate and meaningful than error. However, accuracy is more commonly used in both academia and industry. Many coordinate measurement manufacturers specify a range accuracy of ±x micrometers. Some manufacturers try to specify measurement accuracy as 1σ, 2σ, or 3σ. The ISO/R 1938-1973 standard employs a 2σ band.
10.5 Large Measurement Range with High Accuracy

Most shape measuring systems trade off measurement accuracy for measurement range. However, there is an industrial need for systems that have both a large measurement range and high accuracy. Further research must be done in this area, although there is an encouraging report in which the shape of a 4-m-wide area of a brick wall was measured using fringe projection.
10.6 Measurement System Calibration and Optimization and Sensor Planning

System calibration and optimization are key factors to stretch the measurement accuracy and to achieve 10⁻⁴ to 10⁻⁵ accuracy. Reference 79 shows how to use the same system but with optimization to achieve one order of magnitude higher accuracy, and Ref. 134 demonstrates how to use a novel self-calibration to accomplish a mathematically estimated accuracy of approximately 10⁻⁵. Sensor planning will help to fulfill these goals. Reference 134 also provides a way to eliminate markers using photogrammetry, which is one step further toward making 3-D optical methods more practical in industry.
Acknowledgments

The authors would like to express thanks to T. Cook, B. Bowman, C. Xi, E. Liasi, P. Harwood, and J. Rankin for providing the test setup, data registration and processing, and test results, and to T. Allen for valuable discussions.
References
1. H. J. Tiziani, "Optical metrology of engineering surfaces—scope and trends," in Optical Measurement Techniques and Applications, P. K. Rastogi, Ed., Artech House, Boston (1997).
2.1 Moving, Ht Allsto,V. Kolvonen, and ®. Mila
‘sion spe for antoratic model based shape
Jers Bx. 10, 351989)
3. J. S. Massa, G. S. Buller, A. C. Walker, S. Cova, M. Umasuthan, and A. Wallace, "Time of flight optical ranging system based on time correlated single photon counting," Appl. Opt. 37(31), 7298-7304 (1998).
4.N, Abramson, “Time reconstuction in ligh-inlight ceeding by
holography." Appt. Opt 30, 1382-1253 (1991)
5. T Er Carson, “Menreen of thee dncosonal shapes
lpi tl recording by ology. Opt ng 3, 288)-298
6. B. Nilsson and T. E. Carlsson, "Direct three-dimensional shape measurement by digital light-in-flight holography," Appl. Opt. 37, 7954-7959 (1998).
1. Zand M. C, Lew, “Design of opis uiangulation devices," Opt.
Laser Technol 13), 335-858 U9),
8. CoP. Kefersicin and M Marae,“ Tesing hench for Ise wiangula-
o, Sensei. ey, re a),
‘Mean, “High-precision nage positon sensing methods suitable
fo 3D measures" Opt Lasers Eng. 10, 3-4 1589).
10, Kyene Tecnica Repro Sensor an Mairi aseuneats
it
11. H. Takasaki, "Moiré topography," Appl. Opt. 9, 1467-1472 (1970).
12. R Harding ind. Ta Mon iechotgus applied wo atomaed i
Geigy mene "in Poe SME Viton Con Dera,
13, A. Asan: “Computer aided moiré methods,” Opt Lasers Eng 17
T6F=IH6 (i993),‘chen, Brown, and Song: Overview of three-dimensional shape measurement
14. C. M. Wong, "Image processing in experimental mechanics," MPhil Thesis, University of Hong Kong (1993).
15, BE. Truan, sFast Inleferometers Bring Precision to Tough Appli-
tations." Photons Spectra, 96-99 (199).
16, ALJ. van Haasteren and H. 3. Prankena
‘easement using w multcamera phase st
te” App Opt. 38(19), 4137—4142 (1994)
17, ML Kejwioska, L_ Sabot, and KC Palo, “Tee chanvel phase
Seppe gem oe motion.” At Op 201, 1635-
1685 (1981),
18, Fe Chen, Y. Y, Hong, and J. Gu, “Dual fringe pater phase shifting
Toveferometty for te dependent phase evaleaon.” in SEM Proc
Vil tne Congress on Experimental Mechanics p. (1992)
19, KBieonn and K. Handing, “3D imaging usiag @ nique refractive
‘optic design to combine moiré and sere,” Proc. SPIE 3208, 2-10,
"Real ime displacement
ing speckle interferon
(hen.
20, Ke Haig and YBa, “Ii ped mor contouring mths
‘ialysise" Proc, SPIE 3520, 27-35 (1998)
21. 1°; Ditekx and W. F. Decracmer, “Video moiré topography for
in-vitro studies ofthe eardrum,” Proc. SPIE 3196, 2-11 (1998
22, Rie noch ME Nia, ang, ad, Mua, “oy,
‘valuation of facial palsy by mow to ior sco
report" Proe. SPIE 2027, 138-141 (1996).
23, Yo, Chol and S.-W. Kim “Phase shifting grating po
tepopepiy.” Ope Brg. 373), 1005-1010 (990)
24, Tivlntumto, ¥. Kitagows nT Miget "Sens vale
tok rina phne sitet" Opt ag. 380) VISA
Treo (990
25. T, Yoshioava and T. Tomisawa, “Shadow mois topography by
saggy fe se st mei” Ope Ee, 320 6rd
i983)
26.1, Yothaawa and, Tomi, Msi po
phase shi method,” Proc. SPIE 1S84B, 441830 (990),
27. SF Cardeos-Garcin 8, Zheng, and F. 2, Shen, “Projection moi a
{ol forthe euomted dtertinaion of surface topography.” Proc.
SPIE 18848, 210-228 (1991).
28, T Macumolo, Y- Riagawa, M. Adachi, and Hayashi, “Laser
Thoké topograyty for 3B contour measurement” Proc. SPIE 1332,
Sto-s3¢ (51.
29, 1° A. Liao tod A. S, Volosia, “Surtacefopography through the
tig cece’ of te stadow mo: SEY Pros 38-50
880).
230, 11S: Lim and M. 8, Chung.
‘Appl Ope 27,2683 (1988).
31. 7 Wang, “imaging laser rdar—an overvion,” in Proc. St nt
Conf. Laser "86 pp. 19-29 (1986)
32. 1° Maron and'R. 8. Schroeder, “Tec dimensional eases inn.
fete Bere evi Al Or sh ae
fs)
33, {-€ Maron and. J, Schulz, “Three dimensional, fine resolution
ina ane fegune eins Oe bei H.-Y
982)
34. 1 Sisley, “Speckle decoration technique for remote sensing
‘of cough objet" OSA Anu Mecing Technical Digest, Vol. 18,
308 (989)
38. LG. Shiney, “Remove sensing of obec shape using a wavelength
ansing ase radar” In OSA Anne, Meeting Technical Digest, Vol
Trap. 54 (198.
36, G8 Vallernan and LG. Shisey, “A compsison of surface conto
measurements boed On speckle pitern sampling and coordinate mea-
Strement machines" Proc. SPIE 2909, 89°57 (996),
37. T'Drewel,G, Haile, and HL. Venze,“*Tves dimensional sensing
oie ses by coe ra” Apps 3, 919-528
)
38, LG, Shiley and G. R. Hallman, “Applicaton of tunable lasers 10
laser rar and 3D imaging,” Technical Report 1025, MIT Lincoln
{ab Lexington, MA (196)
39, Ar ita and. B. P. Hildebrand, “Contour generation by wave
{rout constrvton," Phys. Let. 19. 10-11 (1903)
40, Ko Grea YY, Cheap, and J, Wyant, "Contouring aspheric surface
Ting two-vavclengt phase shifting inieeromeuy,” Opt. Aca
$20), 1455=1468 1985)
441, RU. Tatum, 3. C. Davies, CH. Buekbery, and J. D. C. Jones,
“Sbiograpi srface contouring wsing wavelength modlation ofl
ser diodes!" Opt. Laser Technol. 22, 317-321 (1990),
42, TManck G. Roi and W, Schreiber, “Theee coordinate measure:
‘Beat ofan obec strface Wik a combined two wavelength and ¢80-
Bare phase Sufing speckle interferometer” Opt. Commun. 11,
$576-584 (1995).
43, Y Vo. Kondo, Ohya, T, Honda, and J, Tsjuchi, “Measure
Inge nse ce ya ang inereomer An
fol. Sin 92), 120-123 (980)
Zeina and JR. Vac, Matipleindex holographic conus
tng,” Appl Opt 8, Vast -1434 (1969)
45, YOY. Hing. Le Tomer, M. Terlian, J.D. Hovansian, and C. B,
raphy with the aid of
“Mois topography with color gratings.”
46.
4
48,
st.
52.
33
5s.
6
58.
65.
6.
6.
7.
1.
n.
n
1.
1s,
16.
n
“Taylor, “Optical method for meaarng contour slopes fen object.”
‘Appl. Ope I). 138-131 197).
17 Chen, eno resign of he ning ne
Polling using speckle potgrephy technique” Ope Cama.
Ssa38 (997
Ni Abcamon, “Holographic contouring by tansltion,” Appl Op.
15, Hob -1032 (1976
© soceahan, B, Frat, P. Haile, nd J. Taian, “Contosrng by
Sesonte speckle ptt interferometry wing dul Bear tami
Sen pot pe 2 905 1911 (90,
Pri Rast dd Lag, “A Bolographic techniqe etuing broad
Fae ens to comer dfs jects". Med Opt. 38,1673
{abs i991.
R Radefger-Vera, D. Ker, and F. MendoraSantoyo, *Bletronic
‘Specie contouring’ 1. Opr Sor. Am A 9(), 2000-2008 (1992),
Tse Wang ad S. Keishnaswany. “Shape, measurement using
iditivesubsactive phase shifting speckle lsterferomeuy,” Meas.
‘Sei, Technol. 7. 1748-1754 (1990).
all Poche, 8 Kren, and HJ. Tian, “Double etro-
Ae nytt fr high pression dsiance measurements” Proc.
SO ee .
3° Mellor, “Ushi resoluion imerferomeiy,” Proc. SPIE
Jaa TES os
TR isang and K.P. Tatam, “Optoelecvoni stearography: v0
stile teams” Pie SIE ssa, SS 125)
EE Siten, Feng and Fe Cha, hes dimen shape
erat ng ita tap Pre PIER
bos
Cre tay, HM. Shang, A, Pop, and M Luo, “On te determina
Mion af ope by sheaograpy.”” Ope. Lasert Eng. 20, 207-217
(i398
ED, bewin and D. A. Lyon,
tion gatings:" Appl Opt 23,
SDE Dita and B_ A'L3en, “Mls.
inensionldgting ih acon pce Ope Engin Os ise
SiBetbacher Oven, and We Joiner, "Meas Shape and de-
Siete of small stectsesg Sigial bologaphy.” Proc SPIE
$379, 104-115 1998.
1. Wagner, W. Osten, and, Seehacher, "Direct shape measuremeat
5 ha eon fcosrcion ad wavelength sig Ot
1 tissue.
Girt and F Paz, “Conoscopic probes are set to tansfcm indus-
{al metclogy,” Sons, Rew. 18(2), 108-110 (1998).
We Wes inghns, Analisi aontpogratis pha
2nd ed American
tug. 1SPR8 Cone Com V Roto 2B
BEAT inna Now ographic Phowogrme, ode
Sid of Rogan wl Rete Si Fa htc, VA
ft
een and H. Kalen, Bd, Opti! 3D Meawrement Tech
ies ticks G03)
RPG aul Kaine, a, Opal 3.0 Measurement Teh
vipat Pr Ship Sas (5)
Betta Sal Kanes, Opal 3-D Measrement Tec
tous, Wisinans 58
Eee nal of Pram i el, Anesicn Secity
Si Poaoeahmety. Fala Chath, Va 080.
Ree, Franoe meastemen fo one pia a milk
{er Brogan Eg: Renae Sens S89), 305-56 (990)
Wrcieeenib Netlist of rate Garces) wtzing he
Sia S1 Baer wang syuoms" in Opie 3:0 Menuet
Techniques I, Wichmaon 195).
SiR Lose and, Wen “Automated part posoning with
th ey ante Beta Tees ecko C9,
SESSA. C's aod Me isnn “Anaad phase mex
Nesstiicnety of 3 due object Appl Ope 3, 8105-3105
(isa
SP anio, RC. Kim and S.. Cate, “Tre meson inne.
tah annie acted git* Ope Bag, 00, 966-978
(0985),
E, Wal, “A coded light approach for scquisiion” in
Dros, Muskerevkennung. 86, Taformaile Fachberichte 125, Spinget-
Verag (1986.
SToyoka aad Y, Inna, “Automatic of 3 iffine
Sich by patil phase Seton," Apy. Ope 25, 1680-1683 1986).
SP Sjodh! and. Syonergren,*Mewtremen of shape by wing oro;
‘Pac! ran pee anf tnporal dig specie photography.”
Jon Op. 3810), 1990-1997 (059),
‘Sue Tig ore tances of mii
‘lew eomcry ands splcaloa” in Proc, I. Workshop om Al
Poppe forte Meer eo Cle i ey Be.
tin
Sr Adan and A Shaan, "Novel view sytess by casa
cg tyson” TEBE Tras, Vawal Comput. Graph 4), (1998).
‘RGhashon snd M. Werman, "On te tinea texsr of tee per
sje vit an i ding acy.” i Pro. Coon
pus Vision, Boson, MA Guns 195)
Frank Chen: Biography appears with the guest editorial in this issue.
Gordon M. Brown: Biography appears with the guest editorial in this issue.
Mumin Song: Biography not available.