
CN107923736A - Three-dimensional measuring apparatus - Google Patents

Three-dimensional measuring apparatus

Info

Publication number
CN107923736A
Authority
CN
China
Prior art keywords
light
image data
light intensity
intensity distribution
imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201680046181.9A
Other languages
Chinese (zh)
Other versions
CN107923736B (en)
Inventor
梅村信行
大山刚
坂井田宪彦
奥田学
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CKD Corp
Original Assignee
CKD Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CKD Corp filed Critical CKD Corp
Publication of CN107923736A
Application granted
Publication of CN107923736B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • G01B11/2527Projection by scanning of the object with phase change by in-plane movement of the pattern

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A three-dimensional measuring apparatus is provided that can significantly improve measurement accuracy when three-dimensional measurement is performed by the phase shift method. A substrate inspection apparatus (1) includes: an illumination device (4) that projects a predetermined fringe pattern onto the surface of a printed substrate (2) from obliquely above; a camera (5) that images the portion of the printed substrate (2) onto which the fringe pattern is projected; and a control device (6) that performs the various controls, image processing, and arithmetic processing in the substrate inspection apparatus (1). The fringe pattern projected onto the printed substrate (2) is moved, the moving fringe pattern is imaged a plurality of times, and for each pixel the luminance values of the series of captured image data are added and their average value is calculated.

Description

Three-dimensional measuring device
Technical Field
The present invention relates to a three-dimensional measuring apparatus for performing three-dimensional measurement by a phase shift method.
Background
Generally, when electronic components are mounted on a printed circuit board, cream solder is first printed onto a predetermined electrode pattern arranged on the board. Next, the electronic components are temporarily fixed to the board by the viscosity of the cream solder. The board is then introduced into a reflow furnace and soldered through a predetermined reflow process. Here, the printed state of the cream solder needs to be inspected at a stage before the board enters the reflow furnace, and a three-dimensional measuring apparatus may be used for this inspection.
In recent years, various noncontact three-dimensional measuring apparatuses using light have been proposed. Among them, a three-dimensional measuring apparatus using a phase shift method is known.
In a three-dimensional measuring apparatus using a phase shift method, a predetermined fringe pattern is projected onto a measurement target by a predetermined projection unit. The projection unit includes a light source that emits predetermined light and a grating that converts the light from the light source into a stripe pattern.
The grating has a configuration in which a light transmitting portion for transmitting light and a light shielding portion for shielding light are alternately arranged.
Then, the fringe pattern projected onto the measurement target is imaged by an imaging unit disposed directly above the measurement target. As the imaging means, a CCD camera or the like composed of a lens, an imaging element, and the like is used.
Here, the following technique is known in the related art: a fringe pattern that has a rectangular-wave light intensity distribution after passing through the grating is projected onto the measurement object with its focus shifted, so that the pattern actually projected has an approximately sinusoidal light intensity distribution (for example, see Patent Document 1).
Under the above configuration, the intensity (luminance) I of each pixel in the image data captured by the imaging unit is given by the following expression (U1):

I = e + f·sin φ … (U1)

where f is the gain, e is the offset, and φ is the phase of the fringe pattern.

Then, the grating is moved under control so that the phase of the fringe pattern varies in, for example, four steps (φ, φ + 90°, φ + 180°, φ + 270°), image data having the corresponding light intensity distributions I0, I1, I2, I3 are captured in succession, and the phase φ is obtained from them based on the following expression (U2):

φ = tan⁻¹[(I0 − I2)/(I1 − I3)] … (U2)

Using this phase φ, the height Z at each coordinate (X, Y) on the object to be measured can be obtained based on the principle of triangulation.
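As a rough illustration (not part of the patent text), the standard four-step phase recovery can be sketched in Python. The model I_k = e + f·sin(φ + k·90°) and the unit triangulation scale factor are assumptions made for this example; a real device derives the phase-to-height conversion from its projection geometry.

```python
import numpy as np

def phase_from_four_steps(i0, i1, i2, i3):
    """Per-pixel phase from four images whose fringe phases differ by
    90 deg, assuming I_k = e + f*sin(phi + k*90deg).
    Then i0 - i2 = 2f*sin(phi) and i1 - i3 = 2f*cos(phi)."""
    return np.arctan2(i0 - i2, i1 - i3)

# Synthetic check: offset e = 100, gain f = 50, true phase 30 deg
phi_true = np.deg2rad(30.0)
imgs = [100.0 + 50.0 * np.sin(phi_true + k * np.pi / 2) for k in range(4)]
phi = phase_from_four_steps(*imgs)
height = 1.0 * phi  # hypothetical triangulation scale factor (geometry-dependent)
print(round(float(np.rad2deg(phi)), 1))  # 30.0
```

Using `arctan2` rather than a plain arctangent keeps the recovered phase in the correct quadrant over the full −180° to +180° range.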
Prior art documents
Patent document
Patent Document 1: Japanese Patent Laid-Open No. 2007-85862.
Disclosure of Invention
Problems to be solved by the invention
However, unlike projection in focus, it is very difficult to maintain and manage the focus shift of the fringe pattern in a desired state; the light intensity distribution (waveform) of the fringe pattern projected onto the object to be measured is easily disturbed and may fail to become sinusoidal.
Further, since the amount of focus shift of the fringe pattern depends on the relative positional relationship with the measurement target, the light intensity distribution (waveform) of the fringe pattern may change when that relative positional relationship changes.
Further, since projection is performed with the focus shifted, the fringe pattern cannot be projected using a telecentric optical system.
As a result, there is a risk that the measurement accuracy of the three-dimensional measurement is lowered.
These problems are not limited to measuring the height of cream solder or the like printed on a printed circuit board, and also arise in other fields where three-dimensional measuring apparatuses are used.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a three-dimensional measurement apparatus capable of significantly improving measurement accuracy when performing three-dimensional measurement by a phase shift method.
Means for solving the problems
Hereinafter, the technical solutions suitable for solving the above-described problems will be described. Where appropriate, the specific actions and effects of the corresponding technical solution are also noted.
Technical solution 1. A three-dimensional measuring apparatus, characterized by comprising:
a projection unit having a light source that emits predetermined light, a grating that converts the light from the light source into a predetermined stripe pattern, and a drive unit that can move the grating, and capable of projecting the stripe pattern onto an object to be measured (e.g., a printed substrate);
an imaging unit that can image the measurement target on which the fringe pattern is projected;
an image acquisition unit that controls the projection unit and the photographing unit and can acquire a plurality of image data having different light intensity distributions; and
an image processing unit capable of performing three-dimensional measurement of the object by a phase shift method based on a plurality of image data acquired by the image acquisition unit,
the image acquisition unit, after acquiring one of the plurality of image data,
executes a movement process of moving the grating, and
executes an imaging process in which continuous imaging (exposure) is performed for a predetermined period at least partially overlapping with the movement period of the grating,
or
executes an imaging process in which imaging (exposure) is performed a plurality of times over a predetermined period at least partially overlapping with the movement period of the grating, together with a process of adding or averaging, for each pixel, the imaging results (the luminance values of each pixel of the plurality of captured image data).
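The "add or average per pixel" step of the second variant can be sketched as follows. This is an illustrative simulation, not the patent's implementation: the 0/255 two-valued fringe, the 10° phase step per exposure, and the 36-pixel width are all assumptions made for the example.

```python
import numpy as np

def average_moving_fringe(n_shots, step_deg, width=36):
    """Simulate imaging a 0/255 rectangular fringe that advances by
    `step_deg` of phase between exposures, then average per pixel."""
    x = np.arange(width) * 10.0  # pixel -> fringe phase, 10 deg per pixel
    shots = [np.where(np.sin(np.deg2rad(x + k * step_deg)) >= 0, 255.0, 0.0)
             for k in range(n_shots)]
    return np.mean(shots, axis=0)  # per-pixel average over all exposures

avg = average_moving_fringe(n_shots=18, step_deg=10.0)
# The average contains many intermediate grey levels, unlike any single
# two-valued (0/255) shot, which is what smooths the waveform toward a sine.
```

Continuous exposure during the movement corresponds to the limit of many small steps; summing instead of averaging differs only by a constant gain factor.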
According to the above-described technical means 1, a predetermined fringe pattern (for example, a fringe pattern having a light intensity distribution in a rectangular wave shape) projected onto a measurement target is moved, and the moved fringe pattern is continuously captured or captured a plurality of times, and the captured results are added or averaged for each pixel.
Thus, after one image data out of a plurality of image data having different light intensity distributions required for three-dimensional measurement by the phase shift method is acquired, image data having a light intensity distribution closer to an ideal sine wave can be acquired as compared with a case where only a predetermined fringe pattern is projected and imaged.
Here, "sine wave" means "sinusoidal shape"; where "sine wave" is written without qualification, it covers not only an ideal sine wave but also waveforms that approximate a sine wave (the same applies to other "non-sine waves" such as the "rectangular wave" described later).
The "predetermined fringe pattern" also includes "a fringe pattern having a sinusoidal light intensity distribution". That is, the configuration may be such that image data having a light intensity distribution closer to an ideal sine wave is obtained by projecting a fringe pattern whose light intensity distribution is close to, but not an ideal, sine wave.
According to the present invention, even when a stripe pattern is projected in a focused state, image data having a sinusoidal light intensity distribution can be acquired. By being able to project the stripe pattern in a focused state, the light intensity distribution (waveform) of the stripe pattern is easily maintained. Moreover, the projection of the fringe pattern can be performed using a telecentric optical system.
As a result, when three-dimensional measurement is performed by the phase shift method, the measurement accuracy can be significantly improved.
The movement operation of the grating in the "movement processing" may be a continuous operation in which the grating is continuously moved, or may be an intermittent operation in which the grating is intermittently moved (by a predetermined amount).
Note that the "imaging process in which continuous imaging (or imaging in a plurality of times) is performed for a predetermined period at least partially overlapping with the movement period of the grating" includes the case in which imaging is started before the movement of the grating begins, while the grating is still stopped, and the case in which imaging is ended after the movement of the grating has stopped. Therefore, for example, the movement process may be started after the imaging process has been started with the grating stopped, and the imaging process may be ended after the movement of the grating has stopped.
Technical solution 2. The three-dimensional measuring apparatus according to technical solution 1, wherein the imaging process is started simultaneously with or after the start of the movement process of the grating, and is ended simultaneously with or before the stop of the movement process of the grating.
According to technical solution 2, the position (phase) of the fringe pattern captured during the predetermined period is always changing. Thus, image data having a light intensity distribution closer to an ideal sine wave can be acquired than when image data partly captured while the fringe pattern was not moving is included. As a result, the measurement accuracy can be further improved.
Technical solution 3. The three-dimensional measuring apparatus according to technical solution 1 or 2, wherein the predetermined fringe pattern is a fringe pattern having a non-sinusoidal light intensity distribution.
The "non-sinusoidal wave" refers to a predetermined wave other than a "sinusoidal wave" such as a "rectangular wave", a "trapezoidal wave", a "triangular wave", or a "sawtooth wave", for example.
In general, three-dimensional measurement performed by projecting a fringe pattern having a sinusoidal light intensity distribution is more accurate than measurement performed by projecting a fringe pattern having a non-sinusoidal (for example, rectangular-wave) light intensity distribution.
However, as described above, it is very difficult for the projection unit to generate a fringe pattern having a truly sinusoidal light intensity distribution, and attempting to do so may complicate the mechanical configuration.
In this respect, according to technical solution 3, image data having a sinusoidal light intensity distribution can be acquired by relatively simple control and arithmetic processing while a fringe pattern having a non-sinusoidal (for example, rectangular-wave) light intensity distribution is projected, without complicating the mechanical configuration of the projection unit. As a result, both the complexity of the mechanical configuration and the manufacturing cost can be kept down.
Technical solution 4. The three-dimensional measuring apparatus according to any one of technical solutions 1 to 3, wherein the grating is configured such that light transmitting portions that transmit light and light shielding portions that shield light are arranged alternately.
According to technical solution 4, the same effects as those of technical solution 3 are obtained. By using such a binary grating, it is possible to project a fringe pattern whose light intensity distribution has at least a flat peak portion of maximum, constant luminance (hereinafter, "bright portion") and a flat trough portion of minimum, constant luminance (hereinafter, "dark portion"). That is, a fringe pattern having a rectangular-wave or trapezoidal-wave light intensity distribution can be projected.
In general, light passing through the grating is not perfectly parallel light, and a halftone area is generated at a boundary portion between a "light portion" and a "dark portion" of a stripe pattern due to diffraction action or the like at a boundary portion between a light transmitting portion and a light shielding portion, and thus, the light does not become a perfect rectangular wave.
Here, although the waveform depends on the arrangement pitch of the light transmitting portions and the light shielding portions in the grating, a fringe pattern having a rectangular-wave light intensity distribution is formed when the luminance gradient of the halftone region at the boundary between "bright portion" and "dark portion" is steep, and a fringe pattern having a trapezoidal-wave light intensity distribution is formed when that gradient is gentle.
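The dependence on the halftone gradient can be illustrated with a toy cross-section profile. This is purely illustrative: the clipped-sine model and the `slope` knob are our assumptions, not the patent's optics; the point is only that a steep boundary gradient yields a near-rectangular wave while a gentle one yields a trapezoid.

```python
import numpy as np

def fringe_profile(x_deg, slope):
    """Toy fringe cross-section: a clipped, scaled sine. A steep halftone
    gradient (large slope) gives a near-rectangular wave; a gentle one
    (slope near 1) gives a trapezoidal wave with wide sloped flanks."""
    return np.clip(0.5 + slope * np.sin(np.deg2rad(x_deg)), 0.0, 1.0)

x = np.arange(0, 360, 10)
rect_like = fringe_profile(x, slope=10.0)  # steep boundary gradient
trap_like = fringe_profile(x, slope=1.0)   # gentle boundary gradient

def n_halftone(p):  # samples strictly between dark (0) and bright (1)
    return int(np.sum((p > 0.0) & (p < 1.0)))
```

Counting the halftone samples shows the trapezoidal profile spends far more of its period in the intermediate-luminance region than the near-rectangular one.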
Technical solution 5. The three-dimensional measuring apparatus according to any one of technical solutions 1 to 4, wherein the object to be measured is a printed circuit board on which cream solder is printed or a wafer substrate on which solder bumps are formed.
According to technical solution 5, the height of cream solder printed on a printed circuit board or of solder bumps formed on a wafer substrate can be measured. In inspecting the cream solder or the solder bumps, their quality can then be judged from the measured values. Such inspection therefore obtains the effects of the technical solutions above, and quality can be determined with high accuracy. As a result, the inspection accuracy of a solder print inspection apparatus or a solder bump inspection apparatus can be improved.
Drawings
Fig. 1 is a schematic configuration diagram schematically showing a substrate inspection apparatus.
Fig. 2 is a block diagram showing an electrical configuration of the substrate inspection apparatus.
Fig. 3 is a diagram schematically showing the form of a stripe pattern projected onto a printed board.
Fig. 4 is a timing chart for explaining the processing operation of the camera and the illumination device.
Fig. 5 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X1 to X8) every lapse of a predetermined time in the first simulation.
Fig. 6 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X9 to X16) every lapse of a predetermined time in the first simulation.
Fig. 7 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X17 to X24) every lapse of a predetermined time in the first simulation.
Fig. 8 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X25 to X32) every lapse of a predetermined time in the first simulation.
Fig. 9 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X33 to X36) every lapse of a predetermined time in the first simulation.
Fig. 10 is a table relating to the first simulation, where (a) is a table showing a light intensity distribution of an ideal sine wave in the X-axis direction (coordinates X1 to X10) of the imaging element, (b) is a table showing various average values of luminance values in each pixel, and (c) is a table showing differences between the ideal value and the various average values.
Fig. 11 is a table relating to the first simulation, where (a) is a table showing a light intensity distribution of an ideal sine wave in the X-axis direction (coordinates X11 to X20) of the imaging element, (b) is a table showing various average values of luminance values in each pixel, and (c) is a table showing differences between the ideal value and the various average values.
Fig. 12 is a table relating to the first simulation, where (a) is a table showing a light intensity distribution of an ideal sine wave in the X-axis direction (coordinates X21 to X30) of the imaging element, (b) is a table showing various average values of luminance values in each pixel, and (c) is a table showing differences between the ideal value and the various average values.
Fig. 13 is a table relating to the first simulation, where (a) is a table showing a light intensity distribution of an ideal sine wave in the X-axis direction (coordinates X31 to X36) of the imaging element, (b) is a table showing various average values of luminance values in each pixel, and (c) is a table showing differences between the ideal value and the various average values.
Fig. 14 is a graph showing a light intensity distribution of a fringe pattern according to the first simulation.
Fig. 15 is a graph showing an ideal sinusoidal light intensity distribution shown in fig. 10 to 13 (a).
Fig. 16 is a graph plotting various average values shown in fig. 10 to 13 (b).
Fig. 17 is a graph plotting the difference between the various average values and ideal values shown in fig. 10 to 13 (c).
Fig. 18 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X1 to X8) every lapse of a predetermined time in the second simulation.
Fig. 19 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X9 to X16) every lapse of a predetermined time in the second simulation.
Fig. 20 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X17 to X24) every lapse of a predetermined time in the second simulation.
Fig. 21 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X25 to X32) every lapse of a predetermined time in the second simulation.
Fig. 22 is a table showing light intensity distributions of the photographing element in the X-axis direction (coordinates X33 to X36) every lapse of a predetermined time in the second simulation.
Fig. 23 is a table relating to the second simulation, where (a) is a table showing a light intensity distribution of an ideal sine wave in the X-axis direction (coordinates X1 to X10) of the imaging element, (b) is a table showing various average values of luminance values in each pixel, and (c) is a table showing differences between the ideal value and the various average values.
Fig. 24 is a table relating to the second simulation, where (a) is a table showing a light intensity distribution of an ideal sine wave in the X-axis direction (coordinates X11 to X20) of the imaging element, (b) is a table showing various average values of luminance values in each pixel, and (c) is a table showing differences between the ideal value and the various average values.
Fig. 25 is a table relating to the second simulation, where (a) is a table showing the light intensity distribution of an ideal sine wave in the X-axis direction (coordinates X21 to X30) of the imaging element, (b) is a table showing various average values of luminance values in each pixel, and (c) is a table showing the difference between the ideal value and the various average values.
Fig. 26 is a table relating to the second simulation, where (a) is a table showing the light intensity distribution of an ideal sine wave in the X-axis direction (coordinates X31 to X36) of the imaging element, (b) is a table showing various average values of luminance values in each pixel, and (c) is a table showing the difference between the ideal value and the various average values.
Fig. 27 is a graph showing a light intensity distribution of a fringe pattern according to the second simulation.
Fig. 28 is a graph showing the light intensity distribution of the ideal sine wave shown in fig. 23 to 26 (a).
Fig. 29 is a graph plotting various average values shown in (b) of fig. 23 to 26.
Fig. 30 is a graph plotting the difference between the various average values and ideal values shown in fig. 23 to 26 (c).
Fig. 31(a) to (d) are timing charts for explaining processing operations of the camera and the illumination device in the other embodiment.
Detailed Description
Hereinafter, one embodiment will be described with reference to the drawings. Fig. 1 is a schematic configuration diagram schematically showing a substrate inspection apparatus 1 including the three-dimensional measuring apparatus according to the present embodiment. As shown in the figure, the substrate inspection apparatus 1 includes: a mounting table 3 on which a printed circuit board 2 as a measurement target is mounted, the printed circuit board 2 having cream solder K (see fig. 3) printed thereon as the measurement object; an illumination device 4 as a projection unit that projects a predetermined fringe pattern (striped light pattern) onto the surface of the printed circuit board 2 from obliquely above; a camera 5 as an imaging unit that images the portion of the printed circuit board 2 onto which the fringe pattern is projected; and a control device 6 that performs the various controls, image processing, and arithmetic processing in the substrate inspection apparatus 1, such as drive control of the illumination device 4 and the camera 5. The control device 6 constitutes the image acquisition unit and the image processing unit in the present embodiment.
Motors 15 and 16 are provided on the placement stage 3, and the motors 15 and 16 are driven and controlled by the control device 6, whereby the printed circuit board 2 placed on the placement stage 3 is slid in any direction (X-axis direction and Y-axis direction).
The illumination device 4 includes a light source 4a that emits predetermined light and a grating plate 4b that converts the light from the light source 4a into a fringe pattern, and is driven and controlled by a control device 6. Here, the light emitted from the light source 4a is guided to a condenser lens (not shown) and becomes parallel light there, and then is guided to a projection lens (not shown) via a grating plate 4b, and is projected as a stripe pattern on the printed substrate 2.
Further, a telecentric optical system may be used as the optical system of the illumination device 4 such as a condenser lens or a projection lens. The height position of the printed circuit board 2 may slightly change when the printed circuit board moves in the inspection area. If a telecentric optical system is used, the measurement can be performed with high accuracy without being affected by such a change.
The grating plate 4b is configured by an arrangement in which a straight light transmitting portion that transmits light and a straight light shielding portion that shields light are alternately arranged in a predetermined direction orthogonal to the optical axis of the light source 4 a. This makes it possible to project a stripe pattern having a light intensity distribution in a rectangular wave shape or a trapezoidal wave shape on the printed circuit board 2. As shown in fig. 3, in the present embodiment, a stripe pattern in which the direction of the projected stripe is orthogonal to the X-axis direction and parallel to the Y-axis direction is projected.
In general, the light passing through the grating plate 4b is not perfectly parallel light, and a halftone area is generated at the boundary between the "light portion" and the "dark portion" of the stripe pattern due to diffraction action or the like at the boundary between the light transmitting portion and the light shielding portion, and thus the light does not become a perfect rectangular wave. However, in fig. 3, the intermediate gradation region is omitted for simplicity, and the stripe pattern is illustrated as a two-value light and dark stripe pattern.
Here, depending on the arrangement pitch of the light transmitting portions and the light shielding portions in the grating plate 4b, a fringe pattern having a rectangular-wave light intensity distribution is formed when the luminance gradient of the halftone region at the boundary between "bright portion" and "dark portion" is steep (see fig. 14), and a fringe pattern having a trapezoidal-wave light intensity distribution is formed when that gradient is gentle (see fig. 27).
Further, the illumination device 4 includes a driving unit (not shown) such as a motor for moving the grating plate 4b. By drive-controlling this driving unit, the control device 6 can execute a movement process that continuously moves the grating plate 4b at a constant speed in the predetermined direction orthogonal to the optical axis of the light source 4a. As a result, the fringe pattern projected onto the printed circuit board 2 moves along the X-axis direction.
The camera 5 includes a lens, an imaging element, and the like. In the present embodiment, a CCD sensor is used as an imaging element. The imaging element of the present embodiment has a resolution of 512 pixels in the X-axis direction (horizontal direction) and a resolution of 480 pixels in the Y-axis direction (vertical direction), for example.
The camera 5 is driven and controlled by the control device 6. More specifically, the control device 6 performs the imaging process while synchronizing the timing of moving the grating plate 4b with the timing of capturing an image by the camera 5 based on a signal from an encoder (not shown) provided in a driving unit of the grating plate 4 b.
The image data captured by the camera 5 is converted into a digital signal inside the camera 5, and then is input to the control device 6 in the form of a digital signal and stored in an image data storage device 24, which will be described later. The control device 6 performs image processing, arithmetic processing, and the like, which will be described later, based on the image data.
Here, an electrical configuration of the control device 6 will be described. As shown in fig. 2, the control device 6 includes: a CPU and an input/output interface 21 (hereinafter referred to as "CPU and the like 21") that executes control of the entire substrate inspection apparatus 1, an input device 22 as "input means" including a keyboard, a mouse, a touch panel, and the like, a display device 23 as "display means" having a display screen of a CRT, a liquid crystal, or the like, an image data storage device 24 for storing image data and the like captured by the camera 5, an operation result storage device 25 for storing various operation results, and a setting data storage device 26 for storing various information such as design data in advance. The devices 22 to 26 are electrically connected to the CPU and the like 21.
Next, an inspection routine performed by the substrate inspection apparatus 1 for each inspection area of the printed circuit board 2 will be described in detail with reference to fig. 4. Fig. 4 is a timing chart for explaining the processing operation of the camera 5 and the illumination device 4.
This inspection routine is executed by the control device 6 (CPU and the like 21). In the present embodiment, four sets of image data having different light intensity distributions are acquired by performing the image acquisition process four times for each inspection area.
The controller 6 first drives and controls the motors 15 and 16 to move the printed circuit board 2, and aligns the field of view (imaging range) of the camera 5 with a predetermined inspection area on the printed circuit board 2. The inspection area is one area of the surface of the printed circuit board 2 divided in advance with the size of the field of view of the camera 5 as 1 unit.
Next, the controller 6 drives and controls the illumination device 4 to set the grating plate 4b to the first initial setting position (for example, the position of phase "0°") and starts the first image acquisition process. The initial setting position of the grating plate 4b is set differently for each of the four image acquisition processes, so that the phases of the fringe pattern at those initial positions differ from one another by 90° (a quarter pitch).
When the first image acquisition process is started, the control device 6 causes the light source 4a of the illumination device 4 to emit light at a predetermined timing M1, starting the projection of the fringe pattern, and starts the movement process of the grating plate 4b. The stripe pattern projected onto the inspection area thereby moves continuously at a constant speed in the X-axis direction.
The control device 6 also drives and controls the camera 5 to start the imaging process at a predetermined timing N1. Note that, in the present embodiment, the start timing M1 of the movement process of the grating plate 4b and the start timing N1 of the imaging process of the camera 5 are set to be the same.
When the imaging process is started, imaging (exposure) by the camera 5 is performed a plurality of times during its execution. More specifically, the printed circuit board 2 is imaged every time the stripe pattern moves by a predetermined amount Δx (for example, a distance corresponding to 10° of fringe phase), that is, every time a predetermined time Δt elapses. The image data captured by the camera 5 at each elapse of the predetermined time Δt is successively transferred to and stored in the image data storage device 24.
Then, the control device 6 ends the movement process of the grating plate 4b at a timing M2 after a predetermined time has elapsed from the timing M1, ending the projection of the fringe pattern, and ends the imaging process of the camera 5 at a timing N2 after a predetermined time has elapsed from the timing N1. Note that, in the present embodiment, the end timing M2 of the movement process of the grating plate 4b and the end timing N2 of the imaging process of the camera 5 are set to be the same.
When the imaging process of the camera 5 ends, the control device 6 executes a predetermined arithmetic process based on the imaging results obtained in that process. More specifically, an averaging process is performed in which the luminance values of the respective pixels of the series of image data captured in the imaging process (the plurality of image data captured each time the stripe pattern moved by the predetermined amount Δx) are added for each pixel and their average value is calculated. Image data having a sinusoidal light intensity distribution is thereby acquired.
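As a concrete illustration of this averaging process, the per-pixel addition and division can be sketched as follows (a minimal sketch assuming the captured frames are available as 2-D arrays of luminance values; the function name and NumPy representation are illustrative, not part of the embodiment):

```python
import numpy as np

def average_frames(frames):
    """Add the luminance values of each pixel across a series of
    frames captured while the stripe pattern moves, then divide by
    the number of frames to obtain the per-pixel average."""
    stack = np.stack([np.asarray(f, dtype=float) for f in frames])
    return stack.sum(axis=0) / len(frames)
```

The result is a single image whose light intensity distribution approaches a sine wave, as verified by the simulations below.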
Then, the control device 6 stores the image data obtained by the averaging process in the calculation result storage device 25, and ends the first image obtaining process.
On the other hand, after the first image acquisition process is completed, or while the averaging process for the first image acquisition process is being executed, the control device 6 drives and controls the illumination device 4 to set the position of the grating plate 4b to a second initial setting position (for example, the position of phase "90°", where the phase of the fringe pattern is shifted by a quarter pitch from the first initial setting position).
After that, the control device 6 starts the image acquisition processing of the second time. The order of the second image acquisition process is the same as that of the first image acquisition process, and thus detailed description thereof is omitted (the same applies to the third and fourth image acquisition processes).
When image data having a sinusoidal light intensity distribution is acquired by the second image acquisition process, the control device 6 stores the acquired image data in the calculation result storage device 25, and ends the second image acquisition process.
After the second image acquisition process is completed, or while the averaging process for the second image acquisition process is being executed, the control device 6 drives and controls the illumination device 4 to set the position of the grating plate 4b to a third initial setting position (for example, the position of phase "180°", shifted by a quarter pitch from the second initial setting position), and starts the third image acquisition process.
When image data having a sinusoidal light intensity distribution is acquired by the third image acquisition process, the control device 6 stores the acquired image data in the calculation result storage device 25, and ends the third image acquisition process.
After the third image acquisition process is completed, or while the averaging process for the third image acquisition process is being executed, the control device 6 drives and controls the illumination device 4 to set the position of the grating plate 4b to a fourth initial setting position (for example, the position of phase "270°", shifted by a quarter pitch from the third initial setting position), and starts the fourth image acquisition process.
When the image data having the light intensity distribution of the sine wave is acquired by the fourth image acquisition process, the control device 6 stores the acquired image data in the operation result storage device 25, and ends the fourth image acquisition process.
By performing the image acquisition process four times in this way, four sets of image data having different light intensity distributions are acquired. These are equivalent to four sets of image data obtained by shifting the phase of a stripe pattern having a sinusoidal light intensity distribution by 90° at a time.
Next, the control device 6 performs three-dimensional measurement (height measurement) by the known phase shift method described in the background art, based on the four sets of image data (luminance values of the respective pixels) acquired as described above, and stores the measurement results in the operation result storage device 25.
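For reference, the standard four-step phase-shift relation underlying such a measurement can be sketched as follows (a generic sketch assuming the intensity model I_n = A + B·cos(φ + n·90°); the patent does not disclose its exact conversion formula):

```python
import math

def wrapped_phase(i0, i90, i180, i270):
    """Recover the wrapped fringe phase of one pixel from four
    luminance values taken with the pattern shifted by 90 degrees
    between acquisitions, assuming I_n = A + B*cos(phi + n*90deg)."""
    return math.atan2(i270 - i90, i0 - i180)  # phase in (-pi, pi]
```

The height at each pixel is then proportional to this phase (after unwrapping), with the proportionality constant determined by the fringe pitch and the projection geometry.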
Next, the control device 6 performs a quality determination process of the cream solder K based on the three-dimensional measurement result (height data in each coordinate). Specifically, the control device 6 detects the printing range of the cream solder K higher than the reference surface based on the measurement result of the inspection area obtained as described above, and calculates the amount of the printed cream solder K by integrating the heights of the respective portions within the range.
Next, the control device 6 compares the data such as the position, area, height, and amount of the cream solder K obtained in this way with reference data (Gerber data or the like) stored in advance in the setting data storage device 26, and determines whether the print state of the cream solder K in the inspection area is acceptable according to whether the comparison result falls within an allowable range.
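The volume integration and pass/fail comparison described above might be sketched as follows (the function names, the flat reference surface, and the per-item tolerance scheme are assumptions for illustration):

```python
def solder_volume(height_map, reference_height, pixel_area):
    """Integrate the heights above the reference surface over the
    detected print range to estimate the amount of cream solder."""
    return sum(max(h - reference_height, 0.0) * pixel_area
               for row in height_map for h in row)

def print_state_ok(measured, reference, tolerance):
    """Compare measured position/area/height/amount against stored
    reference data, within a per-item allowable range."""
    return all(abs(measured[key] - reference[key]) <= tolerance[key]
               for key in reference)
```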
While such processing is being performed, the control device 6 drives and controls the motors 15 and 16 to move the printed circuit board 2 to the next inspection area, and the series of processing described above is repeated in all the inspection areas, thereby ending the inspection of the entire printed circuit board 2.
Next, the results of verifying the operation and effect of the substrate inspection apparatus 1 according to the present embodiment by simulation are shown. First, the results of simulation (first simulation) in the case of projecting a stripe pattern having a rectangular wave-shaped light intensity distribution will be described with reference to fig. 5 to 17.
In this simulation, a stripe pattern having a rectangular-wave light intensity distribution, in which a halftone region (luminance gradient) 2 pixels in size is present at each boundary between a "bright portion" and a "dark portion", is projected with a period of 36 pixels of the imaging element in the X-axis direction. The stripe pattern is shifted by 1 pixel (10° of fringe phase) in the X-axis direction each time the predetermined time Δt elapses.
Fig. 5 to 9 are tables showing the relationship between the coordinate position of each pixel in the X-axis direction of the imaging element (horizontal axis: coordinates X1 to X36) and the luminance values of the stripe pattern as they change with the passage of time (vertical axis: times t1 to t36). That is, they show the light intensity distribution over the imaging element in the X-axis direction at each elapse of the predetermined time. Note that the simulation assumes that the luminance value of the "bright portion" of maximum luminance is "1" and that of the "dark portion" of minimum luminance is "0".
Fig. 5 to 9 show only one period size of the stripe pattern (36 pixel size in the X axis direction), but actually, a plurality of periods of the stripe pattern are continuously present in the X axis direction. That is, the light intensity distribution represented by the range of coordinates X1 to X36 repeatedly exists.
As shown in fig. 5 to 9, at the imaging time t1, the range of coordinates X2 to X17 is the "bright portion" of luminance value "1", and the range of coordinates X20 to X35 is the "dark portion" of luminance value "0". At coordinates X36 and X1 and coordinates X18 and X19, corresponding to the boundaries between the "bright portion" and the "dark portion", there are 2-pixel halftone regions in which the luminance value changes gradually. That is, the light intensity distribution of the stripe pattern at the imaging time t1 is as shown by the graph in fig. 14.
At the shooting time t2 when the predetermined time Δ t has elapsed from the shooting time t1, the range of coordinates X3 to X18 is the "bright portion" of the luminance value "1", and the range of coordinates X21 to X36 is the "dark portion" of the luminance value "0". Further, at the shooting time t3 when the predetermined time Δ t has elapsed from the shooting time t2, the range of coordinates X4 to X19 is a "bright portion" of the luminance value "1", and the range of coordinates X22 to X1 is a "dark portion" of the luminance value "0".
In this way, the light intensity distribution of the stripe pattern is shifted by 1 pixel per predetermined time Δ t in the right direction of fig. 5 to 9.
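The pattern of this first simulation can be reproduced along the following lines (a sketch; the two halftone pixel values are assumed to be linear steps of 1/3 and 2/3, and the exact coordinate alignment is illustrative, since the tables of figs. 5 to 9 are not reproduced here):

```python
import numpy as np

def rect_fringe(bright=16, dark=16, ramp=2):
    """One 36-pixel period of the rectangular-wave stripe pattern:
    a bright plateau at 1, a dark plateau at 0, and a ramp-pixel
    halftone slope at each light/dark boundary."""
    up = np.linspace(0.0, 1.0, ramp + 2)[1:-1]   # rising halftone pixels
    down = up[::-1]                               # falling halftone pixels
    return np.concatenate([up, np.ones(bright), down, np.zeros(dark)])

def shift(pattern, steps):
    """Move the pattern by `steps` pixels; one pixel per time Δt."""
    return np.roll(pattern, steps)
```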
Next, the result was verified by comparison with an ideal sinusoidal light intensity distribution. Fig. 10 to 13 (a) are tables showing the relationship between the coordinate positions (coordinates X1 to X36) of the pixels of the imaging element in the X-axis direction and the ideal sinusoidal light intensity distribution (ideal values). Here, an ideal sinusoidal light intensity distribution having the same period, amplitude, and phase as the fringe pattern with the rectangular-wave light intensity distribution at the imaging time t1 is shown. The ideal sine wave at the imaging time t1 is as shown by the curve in fig. 15.
Fig. 10 to 13 (b) are tables each showing, for each coordinate position (horizontal axis: coordinates X1 to X36) of the pixels of the imaging element in the X-axis direction, the result (average value) of the averaging process performed on a plurality of pieces of image data (luminance values of the respective pixels) captured within a predetermined period before and after, centered on the image data captured at the imaging time t1.
More specifically, in fig. 10 to 13 (b), the uppermost layer directly shows image data (luminance value of each pixel) captured at the capturing time t1 without performing averaging processing as a comparative example.
The second layer from the top shows the 3-point average value, obtained by averaging the image data (luminance values of the respective pixels) captured at the imaging time t1 together with one image before and one after it, i.e., the 3 images captured at imaging times t36 to t2.
The third layer from the top shows the 5-point average value, averaged over the image at t1 and two images before and after it, i.e., the 5 images captured at imaging times t35 to t3.
The fourth layer from the top shows the 7-point average value, averaged over the image at t1 and three images before and after it, i.e., the 7 images captured at imaging times t34 to t4.
The fifth layer from the top shows the 9-point average value, averaged over the image at t1 and four images before and after it, i.e., the 9 images captured at imaging times t33 to t5.
The sixth layer from the top shows the 11-point average value, averaged over the image at t1 and five images before and after it, i.e., the 11 images captured at imaging times t32 to t6.
The seventh layer from the top shows the 13-point average value, averaged over the image at t1 and six images before and after it, i.e., the 13 images captured at imaging times t31 to t7.
When the average values shown in fig. 10 to 13 (b) are plotted, the curves shown in fig. 16 are obtained.
Fig. 10 to 13(c) are tables showing differences between the ideal values shown in fig. 10 to 13 (a) and the average values shown in fig. 10 to 13 (b) with respect to the coordinate positions of the pixels of the imaging element in the X axis direction (horizontal axis: coordinates X1 to X36).
More specifically, fig. 10 to 13(c) show, as a comparative example, in the uppermost layer, differences between image data (luminance values of respective pixels) captured at the capturing time t1 and respective ideal values without performing averaging processing.
The second layer from the top shows the difference between each 3-point average value and the corresponding ideal value; the third layer, the differences for the 5-point average values; the fourth layer, for the 7-point average values; the fifth layer, for the 9-point average values; the sixth layer, for the 11-point average values; and the seventh layer, for the 13-point average values.
When the values shown in fig. 10 to 13(c) are plotted, the curves shown in fig. 17 are obtained. In addition, the right end of fig. 13(c) shows, for each row, the average and the maximum of these differences over the pixels (coordinates X1 to X36) of the imaging element in the X-axis direction.
As can be seen from the right end of fig. 13(c) and from figs. 16 and 17, the error from the ideal sine wave (ideal value) gradually decreases as the number of averaged images increases (the 5-point average value is closer than the 3-point average value, the 7-point closer than the 5-point, and so on), and the 13-point average value has the smallest error. Therefore, in the present simulation, it is most preferable to perform three-dimensional measurement by the phase shift method using the 13-point average value.
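The trend reported here can also be checked numerically: averaging such a pattern over n one-pixel shifts attenuates its harmonics, leaving a waveform ever closer to its fundamental sine. The following sketch uses a harmonic-residual metric as a rough proxy for the difference tables of the figures (the metric and the assumed 1/3, 2/3 halftone values are illustrative choices, not taken from the patent):

```python
import numpy as np

def n_point_average(pattern, n):
    """Average the pattern over n consecutive one-pixel shifts
    centred on the current position (n odd)."""
    k = n // 2
    return np.mean([np.roll(pattern, s) for s in range(-k, k + 1)], axis=0)

def harmonic_residual(signal):
    """Peak magnitude of everything except the DC term and the
    fundamental; zero for a perfect single-period sine."""
    f = np.fft.rfft(signal)
    f[0] = 0.0
    f[1] = 0.0
    return np.abs(np.fft.irfft(f, len(signal))).max()
```

Applied to the 36-pixel rectangular-wave period, the residual shrinks as n grows from the raw pattern through the 3-point, 5-point, and 13-point averages, consistent with the simulation's conclusion.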
Next, the results of simulation (second simulation) in the case of projecting a stripe pattern having a trapezoidal wave-shaped light intensity distribution will be described with reference to fig. 18 to 30.
In this simulation, a stripe pattern having a trapezoidal-wave light intensity distribution, in which a halftone region (luminance gradient) 12 pixels in size is present at each boundary between a "bright portion" and a "dark portion", is projected with a period of 36 pixels of the imaging element in the X-axis direction. The stripe pattern is shifted by 1 pixel (10° of fringe phase) in the X-axis direction each time the predetermined time Δt elapses.
Fig. 18 to 22 are tables showing the relationship between the coordinate position of each pixel in the X-axis direction of the imaging element (horizontal axis: coordinates X1 to X36) and the luminance values of the stripe pattern as they change with the passage of time (vertical axis: times t1 to t36). That is, they show the light intensity distribution over the imaging element in the X-axis direction at each elapse of the predetermined time. Note that the simulation assumes that the luminance value of the "bright portion" of maximum luminance is "1" and that of the "dark portion" of minimum luminance is "0".
Fig. 18 to 22 show only one period size of the stripe pattern (36 pixel size in the X axis direction), but actually, a plurality of periods of the stripe pattern are continuously present in the X axis direction. That is, the light intensity distribution represented by the range of coordinates X1 to X36 repeatedly exists.
As shown in fig. 18 to 22, at the imaging time t1, the range of coordinates X7 to X12 is the "bright portion" of luminance value "1", and the range of coordinates X25 to X30 is the "dark portion" of luminance value "0". At coordinates X31 to X6 and coordinates X13 to X24, corresponding to the boundaries between the "bright portion" and the "dark portion", there are 12-pixel halftone regions in which the luminance value changes gradually. That is, the light intensity distribution of the stripe pattern at the imaging time t1 is as shown by the curve in fig. 27.
At the shooting time t2 when the predetermined time Δ t has elapsed from the shooting time t1, the range of coordinates X8 to X13 is the "bright portion" of the luminance value "1", and the range of coordinates X26 to X31 is the "dark portion" of the luminance value "0". At the shooting time t3 when the predetermined time Δ t has elapsed from the shooting time t2, the range of coordinates X9 to X14 is the "bright portion" of the luminance value "1", and the range of coordinates X27 to X32 is the "dark portion" of the luminance value "0".
In this way, the light intensity distribution of the stripe pattern is shifted by 1 pixel size in the right direction of fig. 18 to 22 every time the predetermined time Δ t elapses.
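The trapezoidal pattern of this second simulation can likewise be reproduced (a sketch; the twelve halftone values are assumed to be equal linear steps, and the coordinate alignment is illustrative):

```python
import numpy as np

def trapezoid_fringe(bright=6, dark=6, ramp=12):
    """One 36-pixel period of the trapezoidal-wave stripe pattern:
    a 6-pixel bright plateau, a 6-pixel dark plateau, and a
    12-pixel linear halftone slope at each boundary."""
    up = np.linspace(0.0, 1.0, ramp + 2)[1:-1]
    return np.concatenate([np.ones(bright), up[::-1], np.zeros(dark), up])
```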
Next, the result was verified by comparison with an ideal sinusoidal light intensity distribution. Fig. 23 to 26 (a) are tables showing the relationship between the coordinate positions (coordinates X1 to X36) of the pixels of the imaging element in the X-axis direction and the ideal sinusoidal light intensity distribution (ideal values). Here, an ideal sinusoidal light intensity distribution having the same period, amplitude, and phase as the fringe pattern with the trapezoidal-wave light intensity distribution at the imaging time t1 is shown. The ideal sine wave at the imaging time t1 is as shown by the curve in fig. 28.
Fig. 23 to 26 (b) are tables each showing the result (average value) of the averaging process performed on a plurality of pieces of image data (luminance values of respective pixels) captured within a predetermined time before and after the image data captured at the capturing time t1 with respect to the coordinate position (horizontal axis: coordinates X1 to X36) of each pixel in the X-axis direction of the imaging element.
More specifically, fig. 23 to 26 (b) show, as a comparative example, image data (luminance values of respective pixels) captured at the capturing time t1 without performing averaging processing as it is at the uppermost layer.
The second layer from the top shows the 3-point average value, obtained by averaging the image data (luminance values of the respective pixels) captured at the imaging time t1 together with one image before and one after it, i.e., the 3 images captured at imaging times t36 to t2.
The third layer from the top shows the 5-point average value, averaged over the image at t1 and two images before and after it, i.e., the 5 images captured at imaging times t35 to t3.
The fourth layer from the top shows the 7-point average value, averaged over the image at t1 and three images before and after it, i.e., the 7 images captured at imaging times t34 to t4.
The fifth layer from the top shows the 9-point average value, averaged over the image at t1 and four images before and after it, i.e., the 9 images captured at imaging times t33 to t5.
When the average values shown in fig. 23 to 26 (b) are plotted, the curves shown in fig. 29 are obtained.
Fig. 23 to 26(c) are tables showing differences between the ideal values shown in fig. 23 to 26 (a) and the average values shown in fig. 23 to 26 (b) with respect to the coordinate positions of the pixels of the imaging element in the X-axis direction (horizontal axis: coordinates X1 to X36).
More specifically, fig. 23 to 26(c) show, as a comparative example, in the uppermost layer, differences between image data (luminance values of respective pixels) captured at the capturing time t1 and respective ideal values without performing averaging processing.
The second layer from the top shows the difference between each 3-point average value and the corresponding ideal value; the third layer, the differences for the 5-point average values; the fourth layer, for the 7-point average values; and the fifth layer, for the 9-point average values.
When the values shown in fig. 23 to 26(c) are plotted, the curves shown in fig. 30 are obtained. In addition, the right end of fig. 26(c) shows, for each row, the average and the maximum of these differences over the pixels (coordinates X1 to X36) of the imaging element in the X-axis direction.
As can be seen from the right end of fig. 26(c) and from figs. 29 and 30, the 5-point average value has the smallest error from the ideal sine wave (ideal value). Therefore, in the present simulation, it is most preferable to perform three-dimensional measurement by the phase shift method using the 5-point average value.
That said, the differences between the other average values (the 3-point, 7-point, and 9-point average values) and the ideal values are only slightly larger than those of the 5-point average value in the present simulation, so these averages are also sufficiently close to the ideal sine wave.
As described above, according to the present embodiment, a fringe pattern having a rectangular-wave or trapezoidal-wave light intensity distribution projected on the printed circuit board 2 is moved, the moving fringe pattern is imaged a plurality of times, and the luminance values of each pixel of the series of captured image data are added for each pixel to calculate their average value.
Thus, as each one of the plurality of image data sets having different light intensity distributions required for three-dimensional measurement by the phase shift method, image data having a light intensity distribution closer to an ideal sine wave can be acquired, compared with the case where a stripe pattern having a rectangular-wave or trapezoidal-wave light intensity distribution is merely projected and captured.
In addition, according to the present embodiment, even if a stripe pattern is projected in a focused state, image data having a light intensity distribution in a sine wave shape can be acquired. Since the fringe pattern can be projected in a focused state, the light intensity distribution (waveform) of the fringe pattern can be easily maintained.
As a result, when three-dimensional measurement is performed by the phase shift method, the measurement accuracy can be significantly improved.
Further, according to the present embodiment, a stripe pattern having a non-sinusoidal, rectangular-wave or trapezoidal-wave light intensity distribution can be projected without complicating the mechanical configuration, and image data having a sinusoidal light intensity distribution can be acquired by relatively simple control and arithmetic processing. As a result, complication of the mechanical configuration is avoided and the manufacturing cost can be kept down.
The present invention is not limited to the description of the above embodiments, and can be implemented, for example, as follows. Of course, other application examples and modifications not illustrated below are also possible.
(a) In the above embodiment, the three-dimensional measuring device is embodied as the board inspection device 1 that measures the height of the cream solder K printed on the printed board 2, but is not limited to this, and may be embodied as a structure that measures the height of other components such as solder bumps printed on the board, electronic components mounted on the board, and the like.
(b) In the above-described embodiment, four sets of image data in which the initial phases of the fringe pattern differ by 90° are acquired when three-dimensional measurement is performed by the phase shift method, but the number of phase shifts and the phase shift amount are not limited to this. Any other number and amount of phase shifts that enable three-dimensional measurement by the phase shift method may be used.
For example, the three-dimensional measurement may be performed by acquiring 3 sets of image data whose phases are different by 120 ° (or 90 °), or may be performed by acquiring 2 sets of image data whose phases are different by 180 ° (or 90 °).
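For the three-image variant with 120° spacing, a commonly used closed form is the following (a generic sketch assuming I_n = A + B·cos(φ + n·120°); the patent itself does not give this formula):

```python
import math

def wrapped_phase_3step(i1, i2, i3):
    """Recover the wrapped fringe phase of one pixel from three
    luminance values taken at phase offsets 0, 120, and 240 degrees."""
    return math.atan2(math.sqrt(3.0) * (i3 - i2), 2.0 * i1 - i2 - i3)
```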
(c) In the above-described embodiment, the stripe pattern having a rectangular wave-like or trapezoidal wave-like light intensity distribution is projected, and image data having a sinusoidal wave-like light intensity distribution is acquired.
Not limited to this, for example, a stripe pattern having another non-sinusoidal light intensity distribution, such as a triangular wave or a sawtooth wave, may be projected to acquire image data having a sinusoidal light intensity distribution. Of course, where possible, a stripe pattern having a rectangular-wave light intensity distribution with no halftone region (luminance gradient) may be projected to acquire image data having a sinusoidal light intensity distribution.
Further, a fringe pattern having a light intensity distribution that is not an ideal sine wave but approximates one may be projected to acquire image data having a light intensity distribution still closer to an ideal sine wave.
(d) The configuration of the projection unit is not limited to the illumination device 4 according to the above embodiment.
For example, in the above embodiment, the grating plate 4b is used as a grating for converting light from the light source 4a into a stripe pattern.
Not limited to this, for example, a liquid crystal panel may be used as the grating. The liquid crystal panel has a liquid crystal layer formed between a pair of transparent substrates, a common electrode disposed on one of the transparent substrates, and a plurality of strip-shaped electrodes arranged in parallel on the other transparent substrate so as to face the common electrode. A drive circuit controls the on/off of switching elements (thin-film transistors or the like) connected to the strip-shaped electrodes, thereby controlling the voltage applied to each strip-shaped electrode and switching the light transmittance of the grating line corresponding to each electrode. In this way a grating pattern is formed in which light-transmitting portions of high transmittance and light-shielding portions of low transmittance are alternately arranged, and the grating can be moved by switching and controlling the positions of the light-transmitting and light-shielding portions.
Instead of the liquid crystal panel, DLP (registered trademark) using a digital mirror device may be used as the grating.
(e) In the above embodiment, the grating (the grating plate 4b) having two values in which the light transmitting portion and the light shielding portion are alternately arranged is used, but the present invention is not limited to this, and for example, a multi-value grating pattern having different transmittances of 3 steps or more may be formed on the grating plate or the liquid crystal panel.
(f) In the above embodiment, the start timing M1 of the movement process of the grating plate 4b and the start timing N1 of the imaging process of the camera 5 are set to be the same, and the end timing M2 of the movement process and the end timing N2 of the imaging process are set to be the same.
Not limited to this, as shown in fig. 31(a), the imaging process of the camera 5 may be started (start timing N1) after the movement of the grating plate 4b is started (start timing M1), and ended (end timing N2) before the movement of the grating plate 4b is stopped (end timing M2).
As shown in fig. 31(b), the imaging process of the camera 5 may be started (start timing N1) before the movement of the grating plate 4b is started (start timing M1), and ended (end timing N2) simultaneously with or before the movement of the grating plate 4b is stopped (end timing M2).
As shown in fig. 31(c), the imaging process of the camera 5 may be started (start timing N1) simultaneously with or after the movement of the grating plate 4b is started (start timing M1), and ended (end timing N2) after the movement of the grating plate 4b is stopped (end timing M2).
As shown in fig. 31(d), the imaging process of the camera 5 may be started (start timing N1) before the movement of the grating plate 4b is started (start timing M1), and ended (end timing N2) after the movement of the grating plate 4b is stopped (end timing M2).
(g) In the above-described embodiment, each image acquisition process is configured to perform a movement process of continuously moving the grating plate 4b at a constant speed by a driving unit such as a motor. The driving unit of the grating plate 4b is not limited to a member that continuously moves the grating plate 4b such as a motor, and may be a member that intermittently moves (moves by a predetermined amount) the grating plate 4b such as a piezoelectric element.
When the grating plate 4b is moved by the driving means such as the piezoelectric element, for example, one movement process may be performed by one intermittent movement operation, or a plurality of intermittent movement operations of a predetermined amount may be performed.
In the above embodiment, the grating plate 4b is configured to be stopped every time the image pickup processing is performed, but the grating plate 4b may be configured to continuously perform the moving operation while the image pickup processing is performed four times.
(h) In the above-described embodiment, in each image acquisition process, imaging (exposure) is performed a plurality of times while the grating plate 4b is moving, and the luminance values of each pixel of the resulting series of captured image data are added pixel by pixel to calculate their average value.
The invention is not limited to this; the averaging step may be omitted, and three-dimensional measurement may be performed based on addition data (image data) obtained by adding the luminance values of the series of image data pixel by pixel.
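As an illustration only (the patent itself contains no code, and the array shapes and pixel values below are made up), the pixel-by-pixel addition and averaging of a series of captured frames described above can be sketched in NumPy as follows:

```python
import numpy as np

def average_image(frames):
    """Add the luminance values of each pixel across a series of captured
    frames, then divide by the frame count to obtain averaged image data.
    Omitting the final division yields the 'addition data' variant."""
    # Accumulate in a wider integer type so the per-pixel sum of 8-bit
    # frames cannot overflow.
    total = np.zeros(frames[0].shape, dtype=np.uint32)
    for frame in frames:
        total += frame
    return total / len(frames)

# Four exposures captured while the grating moves (synthetic 2x2 frames).
frames = [np.full((2, 2), v, dtype=np.uint8) for v in (10, 20, 30, 40)]
avg = average_image(frames)
print(avg)  # every pixel averages to 25.0
```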
Alternatively, in each image acquisition process, imaging (exposure) may be performed continuously while the grating plate 4b is moving, and three-dimensional measurement may be performed based on the resulting image data.
In general, the greater the amount of light received by the imaging element, the better the image quality and the more suitable the image is for measurement, i.e., the smaller the influence of noise and quantization error. However, if the imaging (exposure) time is too long, the imaging element reaches its saturation level and the image becomes so-called "overexposed". In contrast, by repeating imaging (exposure) a plurality of times while the grating plate 4b is moving, as in the above-described embodiment, and adding the luminance values pixel by pixel, an image with a larger amount of received light can be obtained without saturation.
On the other hand, as long as the imaging element stays below its saturation level, performing imaging (exposure) continuously during the movement of the grating plate 4b imposes a smaller processing load.
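The saturation trade-off just described can be illustrated numerically (the saturation level, exposure times, and flux values here are hypothetical, not taken from the embodiment): a single long exposure clips at the sensor's saturation level and loses the ratio between pixels, while several short exposures added per pixel after readout preserve it.

```python
import numpy as np

SATURATION = 255  # illustrative 8-bit sensor saturation level

# Incident light per unit exposure time at two pixels (illustrative values).
flux = np.array([80.0, 120.0])

# One long exposure of 4 time units: both pixels clip at saturation,
# so the brightness difference between them is lost.
single = np.minimum(flux * 4, SATURATION)   # both pixels clip to 255

# Four short exposures of 1 time unit each, added per pixel after readout:
# each exposure stays below saturation, so the sum keeps the true ratio.
multi = sum(np.minimum(flux, SATURATION) for _ in range(4))

print(single, multi)
```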
(i) In the above embodiment, a CCD sensor is used as the imaging element of the camera 5, but the imaging element is not limited to this; for example, a CMOS sensor or the like may be used.
In addition, since a general CCD camera or the like cannot transfer data during exposure, data transfer (reading) must be performed between the multiple imaging (exposure) operations carried out while the grating plate 4b is moving, as in the above-described embodiment.
In contrast, when a CMOS camera, or a CCD camera capable of exposing during data transfer, is used as the camera 5, imaging (exposure) and data transfer can partially overlap, so the measurement time can be shortened.
Description of the symbols
1 … substrate inspection device, 2 … printed substrate, 4 … lighting device, 4a … light source, 4b … grating plate, 5 … camera, 6 … control device, 24 … image data storage unit.

Claims (5)

1. A three-dimensional measurement device, comprising:
a projection unit having a light source that emits predetermined light, a grating that converts the light from the light source into a predetermined fringe pattern, and a drive unit that can move the grating, and capable of projecting the fringe pattern onto an object to be measured;
an imaging unit that can image the measurement target on which the fringe pattern is projected;
an image acquisition unit that controls the projection unit and the imaging unit and can acquire a plurality of image data having different light intensity distributions; and
an image processing unit capable of performing three-dimensional measurement of the object by a phase shift method based on a plurality of image data acquired by the image acquisition unit,
wherein the image acquisition unit,
after acquiring one of the plurality of image data,
executes a movement process of moving the grating, and
executes an imaging process of imaging continuously for a predetermined period at least partially overlapping with the movement period of the grating,
or
executes an imaging process of imaging a plurality of times in a predetermined period at least partially overlapping with the movement period of the grating, and executes a process of adding or averaging the imaging results for each pixel.
2. The three-dimensional measuring device of claim 1,
the imaging process is started at the same time as or during the movement process of the grating, and the imaging process is ended at the same time as or during the movement process of the grating.
3. Three-dimensional measuring device according to claim 1 or 2,
the predetermined fringe pattern is a fringe pattern having a light intensity distribution that is not sinusoidal.
4. The three-dimensional measuring device according to any one of claims 1 to 3,
the grating is configured by alternately arranging a light transmission part for transmitting light and a light shielding part for shielding the light.
5. The three-dimensional measuring device of any one of claims 1 to 4,
the object to be measured is a printed substrate printed with cream solder or a wafer substrate formed with solder bumps.
CN201680046181.9A 2015-11-27 2016-07-08 Three-dimensional measuring device Active CN107923736B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2015-231661 2015-11-27
JP2015231661A JP6062523B1 (en) 2015-11-27 2015-11-27 3D measuring device
PCT/JP2016/070238 WO2017090268A1 (en) 2015-11-27 2016-07-08 Three-dimensional measurement device

Publications (2)

Publication Number Publication Date
CN107923736A true CN107923736A (en) 2018-04-17
CN107923736B CN107923736B (en) 2020-01-24

Family

Family ID: 57800032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680046181.9A Active CN107923736B (en) 2015-11-27 2016-07-08 Three-dimensional measuring device

Country Status (6)

Country Link
JP (1) JP6062523B1 (en)
CN (1) CN107923736B (en)
DE (1) DE112016005425T5 (en)
MX (1) MX366402B (en)
TW (1) TWI610061B (en)
WO (1) WO2017090268A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114424020A (en) * 2019-10-28 2022-04-29 电装波动株式会社 3D measuring device

Citations (5)

Publication number Priority date Publication date Assignee Title
JP2007113958A (en) * 2005-10-18 2007-05-10 Yamatake Corp 3D measuring apparatus, 3D measuring method, and 3D measuring program
CN101900534A (en) * 2009-05-27 2010-12-01 株式会社高永科技 3-d shape measurement equipment and method for measuring three-dimensional shape
CN103162641A (en) * 2011-12-15 2013-06-19 Ckd株式会社 Device for measuring three dimensional shape
JP2013167464A (en) * 2012-02-14 2013-08-29 Ckd Corp Three-dimensional measurement device
JP2015087244A (en) * 2013-10-30 2015-05-07 キヤノン株式会社 Image processor and image processing method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP4701948B2 (en) * 2005-09-21 2011-06-15 オムロン株式会社 Pattern light irradiation device, three-dimensional shape measurement device, and pattern light irradiation method
CN100561258C (en) * 2008-02-01 2009-11-18 黑龙江科技学院 A phase-shift grating in a three-dimensional measurement system

Also Published As

Publication number Publication date
JP6062523B1 (en) 2017-01-18
JP2017096866A (en) 2017-06-01
WO2017090268A1 (en) 2017-06-01
TWI610061B (en) 2018-01-01
MX366402B (en) 2019-07-08
CN107923736B (en) 2020-01-24
MX2018001490A (en) 2018-08-01
TW201719112A (en) 2017-06-01
DE112016005425T5 (en) 2018-08-16

Similar Documents

Publication Publication Date Title
CN108139208B (en) Three-dimensional measuring device
US10563977B2 (en) Three-dimensional measuring device
CN110291359B (en) 3D measuring device
US8896845B2 (en) Device for measuring three dimensional shape
CN107110643B (en) Three-dimensional measuring device
JP5847568B2 (en) 3D measuring device
JP2010169433A (en) Three-dimensional measuring device
WO2017010110A1 (en) Three-dimensional-measurement device
CN107532890B (en) Three-dimensional measuring device
CN107923736B (en) Three-dimensional measuring device
TWI606227B (en) Three-dimensional measuring device
TW201640076A (en) Three-dimensional measurement device and three-dimensional measurement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant