WO2024154528A1 - Three-dimensional measurement device
- Publication number
- WO2024154528A1 (PCT/JP2023/045741)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- projection
- event data
- unit
- output
- time
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
Definitions
- This disclosure relates to a three-dimensional measuring device that measures the three-dimensional shape of a measurement object.
- The phase shift method is a technique for performing three-dimensional measurement of a measurement object by projecting multiple stripe pattern images with shifted phases onto it.
- A three-dimensional measuring device disclosed in Patent Document 1 below is known as a technology for performing three-dimensional measurement using the phase shift method in this way.
- This three-dimensional measuring device assigns the stripes of each phase to light of a different wavelength, projects a stripe pattern image combining these stripes onto the measurement object, and photographs the measurement object onto which this stripe pattern image is projected with a color camera. Each color component is then extracted from the photographed image and the phase is calculated from a single photograph. This aims to reduce the time required to measure the three-dimensional shape.
- The event camera disclosed in Patent Document 2 is known as a technology for generating images of measurement objects at higher speeds.
- This event camera is a brightness value differential output camera developed taking a hint from the retinal structure of living organisms.
- The event camera detects changes in brightness for each pixel and outputs the pixel's coordinates, the time, and the polarity of the brightness change as event data.
- The event camera therefore does not output the pixel information with no brightness change that conventional cameras output, in other words, redundant data. This reduces the amount of data communication and lightens the image processing load, making it possible to generate images of measurement objects at higher speeds.
- Patent Document 1: Patent No. 3723057
- Patent Document 2: US Patent Application Publication No. 2016/0227135
- In the captured image of the object to be measured, which is generated using the event data output from the event camera, it is possible to ascertain whether or not there is a change in brightness on a pixel-by-pixel basis, but it is not possible to directly measure the brightness value. For this reason, three-dimensional measurement methods that use brightness values, such as the phase shift method, cannot measure the three-dimensional shape of the object.
- The present disclosure has been made to solve the above-mentioned problem, and its purpose is to provide a three-dimensional measurement device that can measure the three-dimensional shape of a measurement target using event data.
- A three-dimensional measuring device according to an embodiment of the present disclosure includes a projection unit that projects a predetermined stripe pattern onto a measurement target area, an imaging unit that images a measurement target placed in the measurement target area onto which the predetermined stripe pattern is projected, a measurement unit that measures the three-dimensional shape of the measurement target by a phase shift method using luminance information obtained from the image captured by the imaging unit, and a control unit that controls the projection unit.
- The imaging unit includes an imaging element that outputs event data including two-dimensional point data that identifies the position of a pixel that has experienced a luminance change when light is received, and generates the captured image from the event data output from the imaging element; the imaging element outputs positive event data in the case of a luminance change that becomes brighter, and outputs negative event data in the case of a luminance change that becomes darker.
- The control unit outputs a projection start signal at the start timing of projection of the predetermined stripe pattern.
- The measurement unit obtains the luminance information for each pixel in the captured image based on the time difference between the output time of the projection start signal and the output time of the event data after the output of the projection start signal.
- In each pixel of the captured image, the higher the luminance value, the longer the time difference between the output time of the positive polarity event data (hereinafter also referred to as the occurrence time) and the occurrence time of the negative polarity event data.
- Since the luminance value and this time difference are correlated, luminance information can be obtained based on the time difference between the occurrence time of the positive polarity event data and the occurrence time of the negative polarity event data for each pixel in the captured image.
- In addition, the occurrence time of the positive polarity event data coincides with the output time of the projection start signal, and the negative polarity event data is output after the output time of the projection start signal.
- Therefore, luminance information can be obtained for each pixel based on the time difference between the output time of the projection start signal when the predetermined stripe pattern is projected and the occurrence time of the event data output after that output time.
- Using the luminance information obtained in this way, the three-dimensional shape of the measurement object can be measured by the phase shift method. Therefore, it is possible to measure the three-dimensional shape of the measurement object using the event data.
- In particular, the three-dimensional measuring device does not obtain the occurrence times of both the positive polarity event data and the negative polarity event data; it obtains only the occurrence time of the event data output after the input of the projection start signal. This shortens the processing time and speeds up image generation of the measurement object. In other words, the three-dimensional shape of the measurement object can be measured more quickly.
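As a rough, non-authoritative illustration of this time-difference-to-luminance conversion (not taken from the patent; the function name, the linear scaling, and the numbers are assumptions for the sketch), a minimal Python version might look like this:

```python
import numpy as np

def luminance_from_events(neg_event_times, t_start, unit_time, max_level=255):
    """Convert per-pixel negative-polarity event timestamps into luminance values.

    neg_event_times : 2-D array holding, for each pixel, the occurrence time of the
                      first negative-polarity event (same time base as t_start).
    t_start         : output time of the projection start signal S1; the positive
                      events are assumed to coincide with it, as described above.
    unit_time       : duration Tp during which a fully bright pixel stays ON.
    """
    on_time = neg_event_times - t_start            # how long each pixel stayed bright
    on_time = np.clip(on_time, 0.0, unit_time)     # guard against noise and outliers
    return (on_time / unit_time) * max_level       # assumed linear ON-time-to-brightness mapping

# Hypothetical example: Tp = 1000 us, S1 output at t = 0
neg_times = np.array([[250.0, 900.0],
                      [  0.0, 500.0]])             # microseconds
print(luminance_from_events(neg_times, t_start=0.0, unit_time=1000.0))
```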
- FIG. 1 is a block diagram showing a schematic configuration of a three-dimensional measuring apparatus according to a first embodiment.
- FIG. 2 is a diagram illustrating a state in which a stripe pattern for a general phase shift method is projected onto a measurement object.
- FIG. 3 is a diagram for explaining three-dimensional measurement by the phase shift method.
- FIG. 4 is a diagram illustrating the relationship between the time difference between the occurrence time of positive polarity event data and the occurrence time of negative polarity event data (i.e., the ON time) and the luminance value (i.e., luminance information).
- FIG. 5A is a diagram illustrating the R-color, G-color, and B-color light emission states of a certain pixel when a stripe pattern is projected.
- FIG. 5B is a diagram illustrating the R-color, G-color, and B-color light emission states in a pixel different from the pixel in FIG. 5A.
- FIG. 6 is a diagram illustrating the relationship between the projection start timing, the projection end timing, the input timing of the projection start signal, the input timing of the projection end signal, the imaging start timing, and the imaging end timing.
- FIG. 7A is a diagram for explaining an example of creating a stripe pattern in which the horizontal axis indicates pixels and the vertical axis indicates ON time.
- FIG. 7B is a diagram illustrating the time difference between the occurrence time of negative polarity event data obtained for each pixel when the stripe pattern shown in FIG. 7A is captured and the input time of a projection start signal.
- FIG. 8 is a hardware configuration diagram of the three-dimensional measuring apparatus according to the first embodiment.
- FIG. 9 is a flowchart showing processing of the three-dimensional measuring apparatus according to the first embodiment.
- The three-dimensional measuring device 10 is a device that measures the three-dimensional shape of a measurement object 50.
- The device includes a control unit 11, a projection unit 20, an imaging unit 30, and a measurement unit 40.
- The control unit 11 is responsible for overall control.
- The projection unit 20 projects a predetermined stripe pattern for the phase shift method onto the measurement object 50.
- The imaging unit 30 captures the measurement object 50 onto which the predetermined stripe pattern is projected.
- The measurement unit 40 measures the three-dimensional shape of the measurement object 50 from the captured image.
- The three-dimensional measuring device 10 configured in this manner is attached, for example, to the hand of a robot and measures the three-dimensional shape of a measurement object 50, such as a workpiece, that moves relative to the hand at high speed.
- Here, the relative movement refers to the relative movement between the motion of the three-dimensional measuring device 10 attached to the hand of the robot and the motion of the measurement object 50.
- When the position of the three-dimensional measuring device 10 is fixed, the relative movement is the movement of the measurement object 50.
- In FIG. 2, for convenience, a predetermined stripe pattern having up to 13 stripes is illustrated in simplified form. More specifically, a typical stripe pattern is represented as a sine wave pattern, so the light and dark parts of the stripe pattern have the same width; in FIG. 2, however, the width of the dark parts is reduced and they are represented by lines. Also, although the number of stripes is 13 or more in the embodiment, it is abbreviated to 13.
- The three-dimensional measuring device 10 has a processor 101 and a memory 103 as its hardware configuration.
- For example, the three-dimensional measuring device 10 may have a microcomputer.
- The microcomputer may have a CPU (Central Processing Unit), a system bus, an input/output interface, a ROM (Read Only Memory), a RAM (Random Access Memory), a non-volatile memory, and so on.
- The memory 103 pre-stores a program related to control of the projection unit 20 and a program for executing control processing using the three-dimensional measurement results obtained by the measurement unit 40.
- The functions of the control unit 11 and the measurement unit 40 may be realized by the above hardware configuration.
- The projection unit 20 is a so-called DLP (registered trademark) projector.
- The projection unit 20 is controlled by the control unit 11 to project a predetermined stripe pattern, described later, by reflecting light from a light source with a DMD (Digital Micromirror Device) element.
- The DMD element has fine mirrors corresponding to each pixel of the image projected on the screen, and these fine mirrors are arranged in an array.
- The DMD element changes the angle of each mirror to switch (i.e., turn ON/OFF) the light emitted toward the screen in microsecond units. That is, the projection unit 20 projects the predetermined stripe pattern by having the control unit 11 control, mirror by mirror, the ON/OFF of reflection of incident light by the DMD in which the multiple mirrors are arranged in an array.
- The control unit 11 changes the gradation (i.e., brightness) of the reflected light according to the ratio of the time each mirror is ON to the time it is OFF. This makes it possible to display gradations based on the image data of the image to be projected.
- For example, the projection unit 20 is equipped with mirrors corresponding to k × l pixels (for example, 1140 × 912). Also consider, for example, a case where R (red), G (green), and B (blue) light is prepared as the light incident on the DMD element.
- In this case, an R-color light emission state in which R light is emitted by being reflected by a mirror, a G-color light emission state in which G light is emitted by being reflected by a mirror, and a B-color light emission state in which B light is emitted by being reflected by a mirror are repeated in a short, predetermined cycle.
- A color image can be projected by individually adjusting the light emission time of each emission state.
- For this reason, the control unit 11 sets the ON/OFF timing of reflection within the unit time for each mirror according to the above-mentioned predetermined stripe pattern.
- The imaging unit 30 is a so-called event camera.
- The imaging unit 30 has an imaging element that outputs event data (specifically, two-dimensional point data, time, and polarity of the brightness change) including two-dimensional point data that identifies the position of a pixel that has undergone a brightness change when light is received.
- The imaging unit 30 generates a captured image from the event data output from the imaging element. For each pixel of the captured image, the imaging unit 30 outputs event data of positive polarity (i.e., a positive brightness change) when a brightness change occurs in which the pixel becomes brighter due to receiving light, and outputs event data of negative polarity (i.e., a negative brightness change) when a brightness change occurs in which the pixel becomes darker due to the light disappearing.
- Image data of the measurement object 50 is generated by plotting the two-dimensional point data of multiple event data output within a certain period of time as points on a specified plane.
- The imaging unit 30 outputs the image data or the event data (i.e., two-dimensional point data, time, and polarity of the brightness change) generated in this way to the measurement unit 40.
- The measurement unit 40 is controlled by the control unit 11.
- The measurement unit 40 measures the three-dimensional shape of the measurement object 50 by the phase shift method based on an image captured by the imaging unit 30 of the measurement object 50 onto which a predetermined stripe pattern is projected from the projection unit 20.
- Generally, in the phase shift method, a phase value θ corresponding to a value distorted according to the surface shape of the measurement object 50 is obtained based on a grating image (i.e., a stripe image), which is an image of the measurement object 50 onto which a predetermined stripe pattern (i.e., a pattern in which the luminance changes periodically in a first direction and does not change in a second direction perpendicular to the first direction) is projected.
- In the phase shift method, a sine wave pattern specified by the luminance value I(x, y, n) of the following formula (1) is adopted.
- That is, when the number of phase shifts is N, the luminance values I(x, y, n) of the N phase-shifted grating images are expressed by formula (1).
- I(x, y, n) = a(x, y) cos{θ(x, y) + 2πn/N} + b(x, y) ... (1)
- Point (x, y) is one point (i.e., one pixel) in the grating image.
- a(x, y) is the luminance amplitude.
- b(x, y) indicates the background luminance.
- θ(x, y) indicates the phase value of the grating at n = 0.
- The distance to point (x, y) is measured according to the phase value θ(x, y) calculated from the luminance values I(x, y, n) of the N grating images.
- Specifically, for example, consider a case where three grating images are obtained in one period formed by the R-color, G-color, and B-color light emission states described above. In this case, with N = 3, the luminance value I(x, y, 0) in the R-color light emission state, the luminance value I(x, y, 1) in the G-color light emission state, and the luminance value I(x, y, 2) in the B-color light emission state are obtained from the captured image.
- In this case, the predetermined stripe pattern for the phase shift method is configured so that the phases of a sine wave pattern consisting only of R, a sine wave pattern consisting only of G, and a sine wave pattern consisting only of B are each shifted by 2π/3.
- When the luminance values I(x, y, 0), I(x, y, 1), and I(x, y, 2) at point (x, y) in the captured image have been obtained, the measurement unit 40 uses the above formula (1) to obtain the phase value θ(x, y).
- The measurement unit 40 measures the distance to point (x, y) according to the phase value θ(x, y) thus obtained. By measuring the distance to each point (x, y) of the imaged measurement object 50 in this manner, the three-dimensional shape of the measurement object 50 can be measured.
- For example, when determining the distance from the three-dimensional measuring device 10 to point P1 in FIG. 3, the measuring unit 40 determines the phase value θ of point P1 and information on which stripe point P1 lies (i.e., the stripe number) from N captured images taken by the imaging unit 30 while the predetermined stripe pattern is shifted and projected N times by the projection unit 20. From the phase value θ and stripe number thus determined, the angle θp1 at the projection unit 20 and the angle θc1 at the imaging unit 30 are determined. Since the distance between the projection unit 20 and the imaging unit 30 (i.e., the parallax Dp) is known, the distance to point P1 can be determined by triangulation.
- Similarly, the distance to point P2 in FIG. 3 can be determined by triangulation based on the angle θp2 at the projection unit 20 and the angle θc2 at the imaging unit 30, which are determined from the phase value θ of point P2 obtained from the N captured images and the stripe number. By performing this calculation over the entire measurement area, three-dimensional measurement can be performed.
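For illustration only, the following sketch applies the law of sines to the triangle formed by the projection unit, the imaging unit, and the measured point. It assumes both angles are measured from the baseline Dp itself; the actual angle conventions of FIG. 3 may differ, so treat this as a generic triangulation example rather than the patent's exact geometry.

```python
import numpy as np

def triangulate_depth(theta_p, theta_c, baseline):
    """Depth of a point from the projector/camera baseline by triangulation.

    theta_p  : angle of the projected ray at the projection unit (radians)
    theta_c  : angle of the viewing ray at the imaging unit (radians)
    baseline : parallax Dp between the projection unit and the imaging unit

    Both angles are assumed to be measured from the baseline, so the triangle
    (projector, camera, point) has interior angles theta_p, theta_c and
    pi - theta_p - theta_c.
    """
    # Law of sines: distance from the projection unit to the point
    dist_from_projector = baseline * np.sin(theta_c) / np.sin(theta_p + theta_c)
    # Perpendicular distance (depth) from the baseline
    return dist_from_projector * np.sin(theta_p)

# Hypothetical example: Dp = 100 mm, theta_p = 60 deg, theta_c = 70 deg
print(triangulate_depth(np.deg2rad(60), np.deg2rad(70), 100.0))
```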
- In this embodiment, an event camera is adopted as the imaging unit for accurately imaging the measurement target 50 that moves relatively at high speed. In such a configuration, event data corresponding to the pixels where a luminance change occurs is output.
- However, since the event data does not include a luminance value, the luminance values required for the phase shift method (i.e., I(x, y, 0), I(x, y, 1), I(x, y, 2)) cannot be obtained directly.
- On the other hand, positive polarity event data is output when light emission starts and negative polarity event data is output when light emission ends, so the longer the time difference between the two outputs, the brighter the pixel. Therefore, as shown in FIG. 4, the luminance value (i.e., luminance information) can be obtained based on the time difference between the occurrence time of positive polarity event data and the occurrence time of negative polarity event data at each pixel in the captured image (i.e., the ON time in FIG. 4). Note that in FIG. 4 and FIG. 5 described later, the output of positive polarity event data is indicated by an upward arrow, and the output of negative polarity event data is indicated by a downward arrow.
- For example, assume that at a certain pixel, as illustrated in FIG. 5A, the R-color, G-color, and B-color light emission states are repeated at a predetermined cycle of 3Tp (unit time Tp).
- In such a light emission state, positive polarity event data is generated and output at the timing when the R-color light emission starts (e.g., t11 in FIG. 5A), and negative polarity event data is generated and output at the timing when the R-color light emission ends (e.g., t12 in FIG. 5A).
- After that, positive polarity event data is generated and output at the timing when the G-color light emission starts (e.g., t13 in FIG. 5A), and negative polarity event data is generated and output at the timing when the G-color light emission ends (e.g., t14 in FIG. 5A).
- After that, positive polarity event data is generated and output at the timing when the B-color light emission starts (e.g., t15 in FIG. 5A), and negative polarity event data is generated and output at the timing when the B-color light emission ends (e.g., t16 in FIG. 5A).
- In a pixel different from the pixel in FIG. 5A, positive polarity event data is generated and output at the start of R-color emission (e.g., t21 in FIG. 5B), and negative polarity event data is generated and output at the end of R-color emission (e.g., t22 in FIG. 5B).
- Likewise, positive polarity event data is generated and output at the start of G-color emission (e.g., t23 in FIG. 5B), negative polarity event data is generated and output at the end of G-color emission (e.g., t24 in FIG. 5B), positive polarity event data is generated and output at the start of B-color emission (e.g., t25 in FIG. 5B), and negative polarity event data is generated and output at the end of B-color emission (e.g., t26 in FIG. 5B).
- Accordingly, the R luminance value can be calculated based on the time from when the R-color light emission starts to when it ends, the G luminance value can be calculated based on the time from when the G-color light emission starts to when it ends, and the B luminance value can be calculated based on the time from when the B-color light emission starts to when it ends.
- In this way, the measurement unit 40 obtains a luminance value (i.e., luminance information) based on the time difference between the occurrence time of positive polarity event data and the occurrence time of negative polarity event data at each pixel in the captured image.
- For example, the measurement unit 40 obtains the luminance value I(x, y, 0) in the R-color light emission state based on t12 − t11, the time difference between the occurrence time of the positive polarity event data and the occurrence time of the negative polarity event data in the R-color light emission state.
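As a hedged sketch of this per-colour calculation (the helper name, the timestamps, and the linear mapping from ON time to luminance are assumptions consistent with the description above, not code from the patent):

```python
def rgb_luminance_from_events(events, unit_time, max_level=255):
    """Per-colour luminance of one pixel from paired event timestamps.

    events    : dict like {"R": (t_on, t_off), "G": (...), "B": (...)}, where
                t_on is the positive-polarity event time and t_off the
                negative-polarity event time of that emission state.
    unit_time : duration Tp reserved for each emission state.
    """
    return {c: (t_off - t_on) / unit_time * max_level
            for c, (t_on, t_off) in events.items()}

# Hypothetical timestamps corresponding to t11..t16 (microseconds, Tp = 1000 us)
pixel = {"R": (0.0, 620.0), "G": (1000.0, 1150.0), "B": (2000.0, 2890.0)}
print(rgb_luminance_from_events(pixel, unit_time=1000.0))  # ~{R: 158, G: 38, B: 227}
```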
- The control unit 11 outputs a start light emission instruction signal to the projection unit 20 to start projecting the predetermined stripe pattern.
- The projection unit 20, upon receiving this start light emission instruction signal, starts projecting the predetermined stripe pattern.
- The control unit 11 outputs a projection start signal S1, synchronized with the start light emission instruction signal, to the measurement unit 40 and the imaging unit 30.
- The control unit 11 also outputs an end light emission instruction signal to the projection unit 20 to end the projection of the predetermined stripe pattern.
- The projection unit 20, upon receiving this end light emission instruction signal, ends the projection of the predetermined stripe pattern.
- The control unit 11 outputs a projection end signal S2, synchronized with the end light emission instruction signal, to the measurement unit 40 and the imaging unit 30.
- The imaging unit 30 starts imaging when the projection start signal S1 is input from the control unit 11, and ends imaging when the projection end signal S2 is subsequently input. Therefore, in the imaging unit 30, positive polarity event data is output for all pixels when the projection start signal S1 is input. Thereafter, in the imaging unit 30, negative polarity event data is output, later for brighter pixels, before the projection end signal S2 is input. Note that the imaging unit 30 may end imaging after a specified time has elapsed from the input time of the projection start signal S1 (i.e., the output time of the projection start signal S1) without using the projection end signal S2.
- In the measurement unit 40, positive polarity event data is input from the imaging unit 30 for all pixels at the timing when the projection start signal S1 is input from the control unit 11.
- Thereafter, negative polarity event data is input from the imaging unit 30 at different timings for each pixel.
- That is, the occurrence time of the positive polarity event data coincides with the input time of the projection start signal S1, and the negative polarity event data is output after the input time of the projection start signal S1. Therefore, it is possible to obtain brightness information for each pixel based on the time difference between the input time of the projection start signal S1 when the above-mentioned predetermined stripe pattern is projected and the occurrence time of the event data output after that input time.
- Accordingly, as shown in FIG. 7B, the measurement unit 40, to which the projection start signal S1 is input, can obtain brightness information for each pixel based on the time difference between the input time of the projection start signal S1 and the occurrence time of the negative polarity event data output after that input time (i.e., the ON time in FIG. 7B), without obtaining the input time of the positive polarity event data from the imaging unit 30.
- The imaging unit 30, which images the measurement object 50 onto which a predetermined stripe pattern is projected from the projection unit 20, is equipped with an imaging element that outputs event data including two-dimensional point data that identifies the position of pixels that have experienced a luminance change when light is received, and generates a captured image from the event data output from the imaging element.
- This imaging element outputs positive event data in the case of a brighter luminance change, and outputs negative event data in the case of a darker luminance change.
- The control unit 11 outputs a projection start signal S1 to the measurement unit 40 at the start of projection of the predetermined stripe pattern, and the measurement unit 40 calculates luminance information for each pixel in the captured image based on the time difference between the input time of the projection start signal S1 from the control unit 11 and the occurrence time of the event data output after that input time.
- With this configuration, luminance information can be obtained for each pixel based on the time difference between the input time of the projection start signal S1 when a predetermined stripe pattern is projected and the occurrence time of the event data output after that input time.
- The luminance information obtained in this way can be used to measure the three-dimensional shape of the measurement object 50 by the phase shift method.
- Moreover, the three-dimensional measuring device 10 only needs to obtain the occurrence time of the event data output after the input of the projection start signal S1, without obtaining the occurrence times of both positive polarity event data and negative polarity event data for all pixels. This shortens the processing time and further speeds up the generation of an image of the measurement object 50. In other words, the three-dimensional shape of the measurement object 50 can be measured more quickly.
- In the above embodiment, the luminance information (i.e., the luminance value) is obtained for each pixel based on the time difference (i.e., the ON time) between the input time of the projection start signal S1 and the occurrence time of the event data output after that input time.
- However, the luminance information is not limited to this, and may be obtained for each pixel based on an OFF time. Specifically, after the start of projection of the predetermined stripe pattern, all the DMD elements of the projection unit 20 may change from the ON state to the OFF state at the same timing.
- In that case, the measurement unit 40 may obtain the luminance information (i.e., the luminance value) for each pixel based on the time difference (i.e., the OFF time) between the input time of the signal for the transition to the OFF state, output after the projection start signal S1, and the occurrence time of the positive polarity event data output after that input time.
- In the above embodiment, the three-dimensional measuring device 10 moves while attached to the hand of a robot and measures the three-dimensional shape of a measurement object that moves relatively.
- However, the three-dimensional measuring device 10 is not limited to this, and may be used, for example, in a fixed state to measure the three-dimensional shape of a measurement object that moves on a conveyor line.
- The projection unit 20, the imaging unit 30, and the measurement unit 40 may each be arranged separately as components of the three-dimensional measurement device 10.
- For example, the measurement unit 40 may be an information processing terminal capable of wireless or wired communication with the projection unit 20 and the imaging unit 30.
- The predetermined stripe pattern is not limited to the one described above, and may be composed of, for example, periodically changing light and dark parts.
- The three-dimensional measuring device 10 may also perform processing as shown in the flowchart of FIG. 9. Specifically, the three-dimensional measuring device 10 projects a stripe pattern and outputs a projection start signal (step S201). The three-dimensional measuring device 10 captures an image of the object onto which the stripe pattern is projected (step S203). The three-dimensional measuring device 10 generates a captured image from the event data output from the image sensor (step S205). The three-dimensional measuring device 10 calculates luminance information based on the time difference between the output of the projection start signal and the output of the event data (step S207). The three-dimensional measuring device 10 measures the three-dimensional shape based on the luminance information (step S209).
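The flowchart steps S201 to S209 could be organized roughly as below. This is a structural sketch only; the three objects and the methods called on them (projector, event_camera, measurement_unit) are hypothetical stand-ins for the projection unit 20, the imaging unit 30, and the measurement unit 40, not an API defined by the patent.

```python
def measure_three_dimensional_shape(projector, event_camera, measurement_unit):
    """Pipeline sketch following the flowchart of FIG. 9 (steps S201-S209)."""
    # S201: project the stripe pattern and output the projection start signal
    t_start = projector.project_stripe_pattern()

    # S203: image the object onto which the stripe pattern is projected
    event_camera.start_capture()

    # S205: generate the captured image from the event data of the image sensor
    events = event_camera.read_events()

    # S207: luminance from the time difference between the projection start
    #       signal and each (negative-polarity) event
    luminance = measurement_unit.luminance_from_time_difference(events, t_start)

    # S209: three-dimensional shape by the phase shift method
    return measurement_unit.phase_shift_reconstruction(luminance)
```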
Abstract
This three-dimensional measurement device comprises: a projection unit; an imaging unit that images an object on which a striped pattern is projected by the projection unit; a measurement unit that uses luminance information obtained from the captured image captured by the imaging unit to measure a three-dimensional shape of the object by a phase shift method; and a control unit that controls the projection unit. The imaging unit is provided with an imaging element that outputs event data including two-dimensional point data that specifies the position of a pixel that has changed in brightness when receiving light, and generates the captured image from the event data. The imaging element outputs event data of positive polarity when the brightness changes to become brighter, and outputs event data of negative polarity when the brightness changes to become darker. The control unit outputs a projection start signal at a projection start timing of the striped pattern. The measurement unit obtains the luminance information for each pixel in the captured image on the basis of a time difference between an output time of the projection start signal and an output time of event data after the output of the projection start signal.
Description
This disclosure relates to a three-dimensional measuring device that measures the three-dimensional shape of a measurement object.
Conventionally, devices that use the phase shift method are known as three-dimensional measuring devices that measure the three-dimensional shape of a measurement object. The phase shift method is a technique for performing three-dimensional measurement of a measurement object onto which multiple stripe pattern images are projected by projecting multiple stripe pattern images with shifted phases. A three-dimensional measuring device disclosed in Patent Document 1 below is known as a technology for performing three-dimensional measurement using the phase shift method in this way. This three-dimensional measuring device assigns stripes of each phase to light of a different wavelength, projects a stripe pattern image composed of these stripes onto the measurement object, and photographs the measurement object onto which this stripe pattern image is projected with a color camera. Then, each color component is extracted from the photographed image and the phase is calculated in a single photograph. This aims to reduce the time required to measure the three-dimensional shape.
Incidentally, the event camera disclosed in Patent Document 2 is known as a technology for generating images of measurement objects at higher speeds. This event camera is a brightness value differential output camera that was developed inspired by the retinal structure of living organisms. The event camera detects changes in brightness for each pixel and outputs the pixel's coordinates, time, and the polarity of the brightness change as event data. With this configuration, the event camera has the characteristic of not outputting pixel information with no brightness changes, which is output by conventional cameras, in other words, redundant data. This reduces the amount of data communication and lightens the image processing load, making it possible to generate images of measurement objects at higher speeds.
However, in the captured image of the object to be measured, which is generated using the event data output from the event camera, although it is possible to ascertain whether or not there is a change in brightness on a pixel-by-pixel basis from the captured image, it is not possible to directly measure the brightness value. For this reason, it is not possible to measure the three-dimensional shape of the object to be measured using three-dimensional measurement methods such as the phase shift method, which use brightness values.
The present disclosure has been made to solve the above-mentioned problems, and its purpose is to provide a three-dimensional measurement device that can measure the three-dimensional shape of a measurement target using event data.
A three-dimensional measuring device according to an embodiment of the present disclosure includes a projection unit that projects a predetermined stripe pattern onto a measurement target area, an imaging unit that images a measurement target placed in the measurement target area onto which the predetermined stripe pattern is projected, a measurement unit that measures the three-dimensional shape of the measurement target by a phase shift method using luminance information obtained from the image captured by the imaging unit, and a control unit that controls the projection unit, the imaging unit includes an imaging element that outputs event data including two-dimensional point data that identifies the position of a pixel that has experienced a luminance change when light is received, and generates the captured image from the event data output from the imaging element, the imaging element outputs positive event data in the case of a luminance change that becomes brighter, and outputs negative event data in the case of a luminance change that becomes darker, the control unit outputs a projection start signal at the start timing of projection of the predetermined stripe pattern, and the measurement unit obtains the luminance information for each pixel in the captured image based on the time difference between the output time of the projection start signal and the output time of the event data after the output of the projection start signal.
In each pixel of the captured image, the higher the luminance value, the longer the time difference between the output time of the positive polarity event data (hereinafter also referred to as the occurrence time) and the occurrence time of the negative polarity event data. In this way, since there is a correlation between the luminance value and the above time difference, luminance information can be obtained based on the time difference between the occurrence time of the positive polarity event data and the occurrence time of the negative polarity event data in pixel units in the captured image. In addition, the occurrence time of the positive polarity event data coincides with the output time of the projection start signal, and the negative polarity event data is output after the output time of the projection start signal. Therefore, luminance information can be obtained for each pixel based on the time difference between the output time of the projection start signal when the above-mentioned specified stripe pattern is projected and the occurrence time of the event data output after the output time of the projection start signal. Using the luminance information obtained in this way, the three-dimensional shape of the measurement object can be measured by the phase shift method. Therefore, it is possible to measure the three-dimensional shape of the measurement object using the event data. In particular, the above-mentioned three-dimensional measuring device does not obtain the occurrence times of both the positive polarity event data and the negative polarity event data, but only obtains the occurrence time of the event data output after the input of the projection start signal. This shortens the processing time and speeds up image generation of the measurement object. In other words, the three-dimensional shape of the measurement object can be measured more quickly.
<First Embodiment>
Hereinafter, a first embodiment of the three-dimensional measuring device of the present disclosure will be described with reference to the drawings. The three-dimensional measuring device 10 according to this embodiment is a device that measures the three-dimensional shape of a measurement object 50. As shown in FIG. 1 and FIG. 2, the device includes a control unit 11, a projection unit 20, an imaging unit 30, and a measurement unit 40. The control unit 11 is responsible for overall control. The projection unit 20 projects a predetermined stripe pattern for the phase shift method onto the measurement object 50. The imaging unit 30 captures the measurement object 50 onto which the predetermined stripe pattern is projected. The measurement unit 40 measures the three-dimensional shape of the measurement object 50 from the captured image. The three-dimensional measuring device 10 configured in this manner is attached, for example, to the hand of a robot and measures the three-dimensional shape of a measurement object 50, such as a workpiece, that moves relative to the hand at high speed. Here, the relative movement refers to the relative movement between the motion of the three-dimensional measuring device 10 attached to the hand of the robot and the motion of the measurement object 50. When the position of the three-dimensional measuring device 10 is fixed, the relative movement is the movement of the measurement object 50.
In FIG. 2, for convenience, a specified stripe pattern having up to 13 stripes is illustrated in a simplified form. More specifically, a typical stripe pattern is represented as a sine wave pattern, so the light and dark parts of the stripe pattern have the same width. However, in FIG. 2, for convenience, the width of the dark parts is reduced and they are represented by lines. Also, although the number of stripes is 13 or more in the embodiment, it is abbreviated to 13.
As shown in FIG. 8, the three-dimensional measuring device 10 has a processor 101 and a memory 103 as a hardware configuration. For example, the three-dimensional measuring device 10 may have a microcomputer. The microcomputer may have a CPU (Central Processing Unit), a system bus, an input/output interface, a ROM (Read Only Memory), a RAM (Random Access Memory), a non-volatile memory, etc. In addition to a program related to robot control, the memory 103 pre-stores a program related to control of the projection unit 20 and a program for executing control processing using the three-dimensional measurement results by the measurement unit 40. The functions of the control unit 11 and the measurement unit 40 may be realized by the above hardware configuration.
The projection unit 20 is a so-called DLP (registered trademark) projector. The projection unit 20 is controlled by the control unit 11 to project a predetermined stripe pattern, which will be described later, by reflecting light from a light source with a DMD (Digital Micromirror Device) element. The DMD element has fine mirrors corresponding to each pixel of the image projected on the screen, and the fine mirrors are arranged in an array. The DMD element changes the angle of each mirror to switch (i.e., turn ON/OFF) the light emitted to the screen in microsecond units. That is, the projection unit 20 functions to project a predetermined stripe pattern by controlling the ON/OFF reflection of incident light by the DMD, which has multiple mirrors arranged in an array, for each mirror by the control unit 11. For this reason, the control unit 11 changes the gradation (i.e., brightness) of the reflected light depending on the ratio of the time each mirror is ON to the time it is OFF. This makes it possible to display gradations based on the image data of the image to be projected.
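As a simple illustration of this duty-cycle idea (assuming, as a sketch, that the gray level scales linearly with the ratio of ON time to the unit time; the function name and numbers are hypothetical, not taken from the patent):

```python
def mirror_on_time(target_level, unit_time, max_level=255):
    """ON time of one DMD mirror within the unit time for a desired gray level.

    The gray level is assumed to scale linearly with the ratio of the ON time
    to the unit time, as described above.
    """
    return unit_time * (target_level / max_level)

# Hypothetical example: unit time of 1000 us, target level 128 -> about 502 us ON
print(mirror_on_time(128, unit_time=1000.0))
```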
In such a configuration, the longer the light emission time of the single pulse light emitted once within the unit time secured for each light emission state, the brighter the light emission state becomes, so that the light emission state can be specified according to the light emission time. If the coordinates of each pixel in FIG. 2 are (1, 1) at the top left and (k, l) at the bottom right, the projection unit 20 is equipped with mirrors corresponding to k×l pixels (for example, 1140×912). Also, for example, consider a case where R (red), G (green), and B (blue) colors are prepared as light incident on the DMD element. In this case, an R light emission state in which R light is emitted by reflecting on the mirror, a G light emission state in which G light is emitted by reflecting on the mirror, and a B light emission state in which B light is emitted by reflecting on the mirror are repeated in a short, predetermined cycle. A color image can be projected by individually adjusting the light emission time of each light emission state. For this reason, the control unit 11 sets the ON/OFF timing of reflection within the unit time for each mirror according to the above-mentioned predetermined stripe pattern.
The imaging unit 30 is a so-called event camera. The imaging unit 30 has an imaging element that outputs event data (specifically, two-dimensional point data, time, and polarity of brightness change) including two-dimensional point data that identifies the position of a pixel that has undergone a brightness change when light is received. The imaging unit 30 generates an image from the event data output from the imaging element. For this reason, the imaging unit 30 outputs event data of positive polarity (i.e., positive brightness change) when a brightness change occurs in which the pixel becomes brighter due to receiving light, and outputs event data of negative polarity (i.e., negative brightness change) when a brightness change occurs in which the pixel becomes darker due to the disappearance of the light. Image data of the measurement object 50 is generated by plotting the two-dimensional point data of multiple event data output within a certain period of time as points on a specified plane. The imaging unit 30 outputs the image data or event data (i.e., two-dimensional point data, time, and polarity of brightness change) generated in this way to the measurement unit 40.
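A minimal sketch of how such event data might be accumulated into a frame is shown below; the (x, y, t, polarity) tuple layout and the accumulation rule are assumptions for illustration, not the imaging unit's actual output format.

```python
import numpy as np

def accumulate_events(events, height, width):
    """Build a simple frame from event data output by the image sensor.

    events : iterable of (x, y, t, polarity) tuples, polarity being +1 or -1.
    Pixels that received at least one event within the accumulation period are
    marked with the most recent polarity; all other (unchanged) pixels stay 0.
    """
    frame = np.zeros((height, width), dtype=np.int8)
    for x, y, _t, polarity in events:
        frame[y, x] = polarity
    return frame

# Hypothetical events: (x, y, time in microseconds, polarity)
evts = [(3, 1, 10.0, +1), (3, 1, 640.0, -1), (0, 0, 15.0, +1)]
print(accumulate_events(evts, height=2, width=5))
```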
The measurement unit 40 is controlled by the control unit 11. The measurement unit 40 measures the three-dimensional shape of the measurement object 50 by the phase shift method based on an image captured by the imaging unit 30 of the measurement object 50 onto which a predetermined stripe pattern is projected from the projection unit 20.
Generally, in the phase shift method, a phase value θ corresponding to a value distorted according to the surface shape of the measurement object 50 is obtained based on a grating image (i.e., a stripe image), which is an image of the measurement object 50 onto which a predetermined stripe pattern (i.e., a pattern in which the luminance changes periodically in a first direction and does not change in a second direction perpendicular to the first direction) is projected. In the phase shift method, a sine wave pattern specified by the luminance value I(x, y, n) of the following formula (1) is adopted. That is, when the number of phase shifts is N, the luminance values I(x, y, n) of the N phase-shifted grating images (i.e., stripe images) are expressed by formula (1).
I(x, y, n) = a(x, y) cos{θ(x, y) + 2πn/N} + b(x, y) ... (1)
Here, point (x, y) is one point (i.e., one pixel) in the grating image. a(x, y) is the luminance amplitude. b(x, y) indicates the background luminance. θ(x, y) indicates the phase value of the grating at n = 0. The distance to point (x, y) is measured according to the phase value θ(x, y) calculated from the luminance values I(x, y, n) of the N grating images.
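Formula (1) can be inverted for θ(x, y) using the standard phase-shift result θ = atan2(−Σ I_n sin(2πn/N), Σ I_n cos(2πn/N)). The sketch below shows this inversion in Python; it is a generic implementation of the relation above, not code from the patent, and the array shapes are assumptions for illustration.

```python
import numpy as np

def phase_from_intensities(I):
    """Recover theta(x, y) from N phase-shifted luminance values.

    I : array of shape (N, H, W) holding I(x, y, n) of formula (1).
    Inverting I_n = a*cos(theta + 2*pi*n/N) + b gives
        theta = atan2(-sum_n I_n*sin(2*pi*n/N), sum_n I_n*cos(2*pi*n/N)).
    """
    N = I.shape[0]
    n = np.arange(N).reshape(-1, 1, 1)
    num = -np.sum(I * np.sin(2 * np.pi * n / N), axis=0)
    den = np.sum(I * np.cos(2 * np.pi * n / N), axis=0)
    return np.arctan2(num, den)

# Check with synthetic data: a = 100, b = 120, theta = 1.0 rad, N = 3
theta_true = 1.0
I = np.array([100 * np.cos(theta_true + 2 * np.pi * n / 3) + 120 for n in range(3)])
print(phase_from_intensities(I.reshape(3, 1, 1)))   # approximately 1.0
```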
Specifically, for example, consider a case where three grating images are obtained in one period formed by the above-mentioned R, G, and B light emission states. In this case, N=3, and the luminance value I(x, y, 0) in the R light emission state, the luminance value I(x, y, 1) in the G light emission state, and the luminance value I(x, y, 2) in the B light emission state are obtained from the captured image. In this case, the specified stripe pattern for the phase shift method is configured so that the phases of the sine wave pattern consisting of only R, the sine wave pattern consisting of only G, and the sine wave pattern consisting of only B are shifted by 2π/3.
When the luminance values I(x,y,0), I(x,y,1), and I(x,y,2) at point (x,y) in the captured image are obtained, the measurement unit 40 uses the above formula (1) to obtain the phase value θ(x,y). The measurement unit 40 measures the distance to point (x,y) according to the phase value θ(x,y) thus obtained. By measuring the distance to each point (x,y) of the captured measurement object 50 in this manner, the three-dimensional shape of the measurement object 50 can be measured.
For example, when determining the distance from the three-dimensional measuring device 10 to point P1 in FIG. 3, the measuring unit 40 determines the phase value θ of point P1 and information on which stripe the point P1 is on (i.e., stripe number) from N captured images of the imaging unit 30 in a state where a predetermined stripe pattern is shifted and projected by the projection unit 20 N times. From the phase value θ and stripe number thus determined, the angle θp1 at the projection unit 20 and the angle θc1 at the imaging unit 30 are determined. Since the distance between the projection unit 20 and the imaging unit 30 (i.e., parallax Dp) is known, the distance to point P1 can be determined by triangulation. Similarly, the distance to point P2 in FIG. 3 can be determined by triangulation based on the angle θp2 at the projection unit 20 and the angle θc2 at the imaging unit 30, which are determined from the phase value θ of point P2 determined from the N captured images and the stripe number. By performing this calculation over the entire measurement area, three-dimensional measurement can be performed.
Here, the three-dimensional measurement process performed by the measurement unit 40 when measuring the three-dimensional shape of the measurement object 50 using the phase shift method will be described in detail with reference to FIGS. 4 and 5.
In this embodiment, an event camera is adopted as the imaging unit for accurately imaging the measurement target 50 that moves relatively at high speed. In such a configuration, event data corresponding to pixels where a luminance change occurs is output. However, since the event data does not include a luminance value, the luminance values required for the phase shift method (i.e., I(x, y, 0), I(x, y, 1), I(x, y, 2)) cannot be directly obtained.
On the other hand, positive-polarity event data is output at the timing when light emission starts, and negative-polarity event data is output at the timing when light emission ends. Consequently, the longer the time difference between the output of the positive-polarity event data and the output of the negative-polarity event data, the brighter the pixel. In other words, for each pixel of the captured image, the higher the luminance value, the longer the time difference between the occurrence time of the positive-polarity event data and that of the negative-polarity event data. As shown in FIG. 4, the luminance value (i.e., luminance information) can therefore be obtained based on the time difference between the occurrence times of the positive-polarity and negative-polarity event data at each pixel of the captured image (i.e., the ON time in FIG. 4). Note that in FIG. 4 and in FIG. 5 described later, the output of positive-polarity event data is indicated by an upward arrow and the output of negative-polarity event data by a downward arrow.
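A minimal per-pixel sketch of this ON-time-to-luminance conversion is shown below; the linear correspondence between the time difference and the luminance value, the unit time Tp as the maximum ON time, and the 8-bit output range are assumptions for illustration.

```python
def luminance_from_on_time(t_positive, t_negative, unit_time_tp, max_luminance=255.0):
    # ON time = time between the positive-polarity and negative-polarity events.
    on_time = t_negative - t_positive
    # Assumed linear mapping, clamped to the unit time Tp.
    return max_luminance * min(max(on_time / unit_time_tp, 0.0), 1.0)
```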
For example, assume that at a certain pixel, as illustrated in FIG. 5A, the R emission state, the G emission state, and the B emission state are repeated with a predetermined cycle of 3Tp (unit time Tp). Under these conditions, positive-polarity event data is generated and output when R emission starts (e.g., t11 in FIG. 5A), and negative-polarity event data is generated and output when R emission ends (e.g., t12 in FIG. 5A). Subsequently, positive-polarity event data is generated and output when G emission starts (e.g., t13 in FIG. 5A), and negative-polarity event data is generated and output when G emission ends (e.g., t14 in FIG. 5A). Finally, positive-polarity event data is generated and output when B emission starts (e.g., t15 in FIG. 5A), and negative-polarity event data is generated and output when B emission ends (e.g., t16 in FIG. 5A).
At a different pixel from that of FIG. 5A, as illustrated in FIG. 5B, positive-polarity event data is generated and output when R emission starts (e.g., t21 in FIG. 5B), and negative-polarity event data is generated and output when R emission ends (e.g., t22 in FIG. 5B). Subsequently, positive-polarity event data is generated and output when G emission starts (e.g., t23 in FIG. 5B), and negative-polarity event data is generated and output when G emission ends (e.g., t24 in FIG. 5B). Finally, positive-polarity event data is generated and output when B emission starts (e.g., t25 in FIG. 5B), and negative-polarity event data is generated and output when B emission ends (e.g., t26 in FIG. 5B).
Here, the longer the time from the start of R emission to the end of R emission, the brighter the R component. Therefore, the R luminance value can be obtained from the time between the start and end of R emission. Similarly, the G luminance value can be obtained from the time between the start and end of G emission, and the B luminance value from the time between the start and end of B emission.
The luminance value (i.e., luminance information) can therefore be obtained for each pixel of the captured image based on the time difference between the occurrence times of the positive-polarity and negative-polarity event data. In the example of FIG. 5A, the luminance value I(x,y,0) in the R emission state can be obtained from t12-t11, the time difference between the positive-polarity and negative-polarity event data in the R emission state. Similarly, the luminance value I(x,y,1) in the G emission state and the luminance value I(x,y,2) in the B emission state can be obtained from the time differences t14-t13 and t16-t15, respectively. Using the luminance values obtained in this way, the three-dimensional shape of the measurement object 50 can be measured by the phase shift method; that is, the three-dimensional shape of the measurement object 50 can be measured using the event data. Note that the calculation of the luminance value from the time difference may be performed, for example, based on a predetermined correspondence between the time difference and the luminance value.
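The per-pixel extraction of the three luminance values from an event stream such as that of FIG. 5A could look like the following sketch; the event tuple layout, the chronological ordering, and the linear normalisation of the ON times by the unit time Tp are assumptions for illustration.

```python
def rgb_luminance_from_events(events, unit_time_tp):
    # events: chronologically ordered (timestamp, polarity) pairs for one pixel over
    # one R -> G -> B cycle, e.g. [(t11, +1), (t12, -1), (t13, +1), (t14, -1), (t15, +1), (t16, -1)].
    on_times = []
    rise = None
    for t, polarity in events:
        if polarity > 0:
            rise = t                   # emission start
        elif rise is not None:
            on_times.append(t - rise)  # t12-t11, t14-t13, t16-t15
            rise = None
    # Assumed linear mapping of ON time to luminance, normalised by Tp.
    return tuple(on / unit_time_tp for on in on_times[:3])  # I(x,y,0), I(x,y,1), I(x,y,2)
```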
Next, a characteristic configuration of this embodiment, in which three-dimensional measurement is further sped up by using the projection start signal output from the control unit 11 to the measurement unit 40 at the timing when projection of the predetermined stripe pattern starts, will be described, taking projection during the unit time Tp as an example. In this embodiment, all DMD elements of the projection unit 20 are in the ON state when projection of the predetermined stripe pattern starts, and the darker a DMD element is to project, the shorter the time from its ON state to its OFF state (i.e., its ON time).
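On the projection side, the relation "darker pixel, shorter ON time" can be sketched as below; the proportional mapping from an assumed 8-bit pattern level to an ON duration within the unit time Tp is an assumption for illustration, not a description of the actual DMD drive scheme.

```python
def mirror_on_durations(pattern, unit_time_tp, max_level=255):
    # pattern: 2-D list of assumed 8-bit brightness levels of the stripe pattern.
    # Every mirror switches ON at projection start; a darker pixel switches OFF sooner.
    return [[unit_time_tp * level / max_level for level in row] for row in pattern]
```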
In this embodiment, the control unit 11 outputs a light-emission start instruction signal to the projection unit 20 to start projecting the predetermined stripe pattern, and the projection unit 20 starts projecting the predetermined stripe pattern upon receiving this signal. As shown in FIG. 6, the control unit 11 outputs a projection start signal S1, synchronized with the light-emission start instruction signal, to the measurement unit 40 and the imaging unit 30. The control unit 11 also outputs a light-emission end instruction signal to the projection unit 20 to end the projection of the predetermined stripe pattern, and the projection unit 20 ends the projection upon receiving this signal. The control unit 11 outputs a projection end signal S2, synchronized with the light-emission end instruction signal, to the measurement unit 40 and the imaging unit 30.
The imaging unit 30 starts imaging when the projection start signal S1 is input from the control unit 11 and ends imaging when the subsequent projection end signal S2 is input. Therefore, in the imaging unit 30, positive-polarity event data is output for all pixels at the timing when the projection start signal S1 is input. Thereafter, until the projection end signal S2 is input, negative-polarity event data is output at a later timing for brighter pixels. Note that the imaging unit 30 may instead end imaging after a specified time has elapsed from the input time of the projection start signal S1 (i.e., the output time of the projection start signal S1), without using the projection end signal S2.
Consequently, in the measurement unit 40, positive-polarity event data for all pixels is input from the imaging unit 30 at the timing when the projection start signal S1 is input from the control unit 11. Thereafter, negative-polarity event data is input from the imaging unit 30 at a different timing for each pixel.
That is, the occurrence time of the positive-polarity event data coincides with the input time of the projection start signal S1, and the negative-polarity event data is output after the input time of the projection start signal S1. Therefore, luminance information can be obtained for each pixel based on the time difference between the input time of the projection start signal S1 when the predetermined stripe pattern is projected and the occurrence time of the event data output after that input time.
For example, consider the case where a stripe pattern created as shown in FIG. 7A is projected from the projection unit 20. In this case, as shown in FIG. 7B, the measurement unit 40, to which the projection start signal S1 is input, can obtain luminance information for each pixel based on the time difference between the input time of the projection start signal S1 and the occurrence time of the negative-polarity event data output after that input time (i.e., the ON time in FIG. 7B), without acquiring the input times of the positive-polarity event data from the imaging unit 30.
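A sketch of this S1-based calculation over a whole frame is given below; the dictionary layout of the negative-polarity event times and the linear ON-time/luminance correspondence over the unit time Tp are assumptions for illustration.

```python
def luminance_from_projection_start(t_s1, negative_event_times, unit_time_tp, max_luminance=255.0):
    # negative_event_times: {(x, y): occurrence time of the negative-polarity event}.
    # The positive-polarity events need not be collected: every pixel turns ON at S1,
    # so the ON time is simply t_negative - t_S1.
    luminance = {}
    for (x, y), t_negative in negative_event_times.items():
        on_time = t_negative - t_s1
        luminance[(x, y)] = max_luminance * min(max(on_time / unit_time_tp, 0.0), 1.0)
    return luminance
```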
In the example of FIG. 5A, since the unit time Tp is known, it is likewise unnecessary to acquire the input time t11 of the positive-polarity event data at the start of R emission, the input time t13 at the start of G emission, and the input time t15 at the start of B emission. That is, luminance information can be obtained for each pixel based on the unit time Tp and on the time differences between the input time of the projection start signal S1 and the input time t12 of the negative-polarity event data at the end of R emission, the input time t14 at the end of G emission, and the input time t16 at the end of B emission.
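For the FIG. 5A case, this could be sketched per pixel as follows; taking the rise times as t_S1, t_S1 + Tp, and t_S1 + 2Tp and normalising the ON times linearly by Tp are assumptions for illustration.

```python
def rgb_luminance_from_s1(t_s1, t12, t14, t16, unit_time_tp):
    # Only the negative-polarity event times t12, t14, t16 are acquired; each colour
    # slot lasts the known unit time Tp, so the rise times are taken as
    # t_S1, t_S1 + Tp and t_S1 + 2*Tp.
    on_r = t12 - t_s1
    on_g = t14 - (t_s1 + unit_time_tp)
    on_b = t16 - (t_s1 + 2.0 * unit_time_tp)
    # Assumed linear mapping, clamped to the unit time Tp.
    return tuple(min(max(on / unit_time_tp, 0.0), 1.0) for on in (on_r, on_g, on_b))
```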
As described above, in the three-dimensional measuring device 10 according to this embodiment, the imaging unit 30, which images the measurement object 50 onto which the predetermined stripe pattern is projected from the projection unit 20, includes an imaging element that outputs event data including two-dimensional point data identifying the position of a pixel whose luminance changed when light was received, and generates the captured image from the event data output from the imaging element. This imaging element outputs positive-polarity event data when the luminance change is toward brighter and negative-polarity event data when the luminance change is toward darker. The control unit 11 outputs the projection start signal S1 to the measurement unit 40 at the timing when projection of the predetermined stripe pattern starts, and the measurement unit 40 obtains luminance information for each pixel of the captured image based on the time difference between the input time of the projection start signal S1 from the control unit 11 and the occurrence time of the event data output after that input time.
In this way, luminance information can be obtained for each pixel based on the time difference between the input time of the projection start signal S1 when the predetermined stripe pattern is projected and the occurrence time of the event data output after that input time, and the three-dimensional shape of the measurement object 50 can be measured by the phase shift method using the luminance information thus obtained. In particular, the three-dimensional measuring device 10 does not need to acquire the occurrence times of both positive-polarity and negative-polarity event data for all pixels; it only needs to acquire the occurrence times of the event data output after the input of the projection start signal S1. This shortens the processing time and further speeds up image generation for the measurement object 50, so that the three-dimensional shape of the measurement object 50 can be measured more quickly.
Note that the present disclosure is not limited to the above embodiment and may be embodied, for example, as follows.
(1) In the above embodiment, the luminance information (i.e., the luminance value) is obtained for each pixel based on the time difference (i.e., the ON time) between the input time of the projection start signal S1 and the occurrence time of the event data output after that input time. However, the luminance information is not limited to this and may instead be obtained for each pixel based on an OFF time. Specifically, after the start of projection of the predetermined stripe pattern, all DMD elements of the projection unit 20 may switch from the ON state to the OFF state at the same timing. The measurement unit 40 may then obtain the luminance information (i.e., the luminance value) for each pixel based on the time difference (i.e., the OFF time) between the input time of the signal for the transition to the OFF state, output after the projection start signal S1, and the occurrence time of the positive-polarity event data output after that input time.
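Read this way, the OFF-time variation could be sketched per pixel as follows; the assumption that a darker pixel has a longer OFF time (mirroring the shorter ON time of the main embodiment) and the linear mapping are illustrative only.

```python
def luminance_from_off_time(t_off_signal, t_positive, unit_time_tp, max_luminance=255.0):
    # OFF time = time between the all-OFF transition signal and the pixel's
    # positive-polarity event; assumed here to be longer for darker pixels.
    off_time = t_positive - t_off_signal
    return max_luminance * (1.0 - min(max(off_time / unit_time_tp, 0.0), 1.0))
```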
(2) In the above embodiment, the three-dimensional measuring device 10 moves while attached to the hand of a robot and measures the three-dimensional shape of a measurement object that moves relative to it. However, the three-dimensional measuring device 10 is not limited to this and may, for example, be used in a fixed state to measure the three-dimensional shape of a measurement object moving on a conveyor line.
(3) The projection unit 20, the imaging unit 30, and the measurement unit 40 may each be arranged as separate bodies constituting the three-dimensional measuring device 10. The measurement unit 40 may also be an information processing terminal capable of wireless or wired communication with the projection unit 20 and the imaging unit 30.
(4) In the above embodiment, the predetermined stripe pattern projected by the projection unit 20 with N shifts is composed of the R, G, and B emission states, with N = 3. However, the predetermined stripe pattern is not limited to this and may, for example, be composed of periodically alternating light and dark portions.
The three-dimensional measuring device 10 may also perform processing as shown in the flowchart of FIG. 9. Specifically, the three-dimensional measuring device 10 projects a stripe pattern and outputs a projection start signal (step S201), captures an image of the object onto which the stripe pattern is projected (step S203), generates a captured image from the event data output from the imaging element (step S205), calculates luminance information based on the time difference between the output of the projection start signal and the output of the event data (step S207), and measures the three-dimensional shape based on the luminance information (step S209).
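A compact sketch of this flow is given below; the objects and method names standing in for the projection unit 20, the imaging unit 30, and the measurement unit 40 are hypothetical, and only the ordering of steps S201 to S209 follows the text.

```python
def measure_three_dimensional_shape(projector, event_camera, measurement_unit):
    t_s1 = projector.start_stripe_pattern()               # S201: project pattern, output projection start signal
    events = event_camera.capture_stripe_projection()     # S203: image the object under the projected pattern
    captured_image = event_camera.build_image(events)     # S205: generate captured image from the event data
    luminance = measurement_unit.luminance_from_time_difference(t_s1, events)        # S207
    return measurement_unit.measure_shape_by_phase_shift(captured_image, luminance)  # S209
```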
Claims (3)
1. A three-dimensional measuring device comprising:
a projection unit that projects a predetermined stripe pattern onto a measurement target area;
an imaging unit that images a measurement object disposed in the measurement target area onto which the predetermined stripe pattern is projected;
a measurement unit that measures a three-dimensional shape of the measurement object by a phase shift method using luminance information obtained from an image captured by the imaging unit; and
a control unit that controls the projection unit,
wherein the imaging unit includes an imaging element that outputs event data including two-dimensional point data identifying a position of a pixel whose luminance changed when light was received, and generates the captured image from the event data output from the imaging element,
the imaging element outputs positive-polarity event data when the luminance change is toward brighter and negative-polarity event data when the luminance change is toward darker,
the control unit outputs a projection start signal at a projection start timing of the predetermined stripe pattern, and
the measurement unit determines the luminance information for each pixel in the captured image based on a time difference between an output time of the projection start signal and an output time of event data after the output of the projection start signal.
2. The three-dimensional measuring device according to claim 1, wherein the projection unit includes a plurality of mirrors respectively corresponding to a plurality of pixels in the captured image for projecting the predetermined stripe pattern, and causes all of the plurality of mirrors to reflect projection light at the projection start timing, and
the time difference is a time difference between the output time of the projection start signal and an output time of negative-polarity event data after the output of the projection start signal.
3. The three-dimensional measuring device according to claim 1, wherein the projection unit includes a plurality of mirrors respectively corresponding to a plurality of pixels in the captured image for projecting the predetermined stripe pattern, and stops reflection of the projection light from all of the plurality of mirrors at the projection start timing, and
the time difference is a time difference between the output time of the projection start signal and an output time of positive-polarity event data after the output of the projection start signal.