
WO2015034048A1 - Three-dimensional shape measurement method and three-dimensional shape measurement device - Google Patents


Info

Publication number
WO2015034048A1
WO2015034048A1 (application PCT/JP2014/073495)
Authority
WO
WIPO (PCT)
Prior art keywords
light
dark
code
dimensional shape
pattern
Prior art date
Application number
PCT/JP2014/073495
Other languages
French (fr)
Japanese (ja)
Inventor
宏 涌田
Original Assignee
アルプス電気株式会社 (Alps Electric Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by アルプス電気株式会社 (Alps Electric Co., Ltd.)
Publication of WO2015034048A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2513: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G06T 7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/10: Image acquisition
    • G06V 10/12: Details of acquisition arrangements; Constructional details thereof
    • G06V 10/14: Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V 10/145: Illumination specially adapted for pattern recognition, e.g. using gratings

Definitions

  • The present invention relates to a three-dimensional shape measurement method in which a measurement target onto which pattern light has been projected from a projection apparatus such as a projector is photographed with an imaging apparatus such as a camera and the three-dimensional shape of the measurement target is measured by image processing, and to a three-dimensional shape measurement apparatus.
  • A spatial coding method is known as a method for measuring the three-dimensional shape of a measurement target with a single projection and photographing of a pattern. As one such spatial coding method, a three-dimensional shape measurement method has been proposed that reconstructs a three-dimensional shape by projecting a pattern in which black-and-white graphic codes having a circular shape are arranged in a two-dimensional series onto a measurement target and photographing the measurement target (see, for example, Patent Document 1).
  • The present invention has been made in view of the above points, and aims to provide a three-dimensional shape measurement method and a three-dimensional shape measurement apparatus capable of measuring the three-dimensional shape of a measurement target with a single projection and photographing operation of pattern light while ensuring spatial resolution.
  • The three-dimensional shape measurement method of the present invention comprises a projecting step of projecting pattern light in which a plurality of light/dark codes are arranged onto a measurement target, a photographing step of photographing the measurement target onto which the pattern light is projected, an extracting step of extracting the light/dark codes from the captured image of the measurement target, an identifying step of identifying the type of each extracted light/dark code, and a three-dimensional position calculating step of calculating a three-dimensional position based on the position information of each identified light/dark code, wherein each light/dark code includes a central element and one or more peripheral elements arranged around the central element, and the type of the light/dark code is identified by the arrangement of the peripheral elements.
  • In the pattern light used when measuring the three-dimensional shape of a measurement target with a single projection and photographing of the pattern, a plurality of light/dark codes, each consisting of a central element and peripheral elements arranged around it, are arranged, and the type of each light/dark code is identified by the arrangement of its peripheral elements. For this reason, compared with the case where graphic codes are used, the number of pixels required for detecting and identifying one code can be reduced. As a result, the number of codes that can be detected per unit area by the image sensor can be increased, and the spatial resolution can be improved.
  • The pattern light preferably has, as a structural unit, a code group in which a predetermined number of the light/dark codes are arranged in a first direction, the structural unit being repeatedly arranged in a second direction perpendicular to the first direction.
  • Further, the pattern light preferably has, as a structural unit, a code group in which a predetermined number of the light/dark codes are arranged in a first direction parallel to the epipolar plane, the structural unit being repeatedly arranged in both the first direction and a second direction perpendicular to the epipolar plane.
  • Compared with the case where the pattern light is encoded in two directions with respect to the epipolar plane (directions parallel and perpendicular to it), as in a two-dimensional array pattern, encoding only in the direction parallel to the epipolar plane (the first direction) reduces the number of codes, and hence the number of bits necessary to represent all the codes. Moreover, when a code moves in the direction parallel to the epipolar plane (the first direction) after projection, it can still be distinguished from the same code in the adjacent code group, so the discriminable width (the range within which the parallax deviation between the projection pattern and the photographing pattern can be distinguished) can be made long with a small number of bits. As a result, a long measurement range in the depth direction can be secured, and an increase in calculation cost can be suppressed.
  • Preferably, in the pattern light, a common peripheral element is disposed between the central elements of adjacent light/dark codes, so that adjacent light/dark codes are continuously connected. In this case, since peripheral elements are shared by adjacent light/dark codes, the density of the light/dark codes arranged in the pattern light can be increased. The number of codes that can be detected by the image sensor can thereby be increased, further improving the spatial resolution.
  • Preferably, the central element of the light/dark code is extracted, and the position information of the light/dark code is acquired from the position of the central element. In this case, since the position information is acquired based on the position of the central element, the entire shape of the light/dark code need not be recognized; the corresponding processing can be omitted, further suppressing an increase in calculation cost.
  • Preferably, the light/dark code has four positions at which peripheral elements can be arranged around the central element. In this case, since four peripheral elements can be arranged around the central element, a 4-bit code can be realized with a single light/dark code depending on the presence or absence of each of these peripheral elements.
  • The peripheral elements of the light/dark code may be arranged obliquely to the arrangement direction of the light/dark codes when viewed from the central element. In this case, the length of each peripheral element can be made larger than when the elements are arranged parallel or perpendicular to the arrangement direction, which improves the discriminability of the presence or absence of the peripheral elements in the light/dark code.
  • Preferably, the pattern light projected in the projection step has a wavelength outside the visible range. By using an invisible wavelength band such as infrared light for the pattern light, the three-dimensional shape can be measured without hindering the actions of the person or the like being measured (for example, the driving operation of a vehicle).
  • The three-dimensional shape measurement apparatus of the present invention comprises a projection unit that projects pattern light in which a plurality of light/dark codes are arranged onto a measurement target, an imaging unit that photographs the measurement target onto which the pattern light is projected, an extraction unit that extracts the light/dark codes from the captured image of the measurement target, an identification unit that identifies the type of each light/dark code extracted by the extraction unit, and a calculation unit that calculates a three-dimensional position based on the position information of each light/dark code identified by the identification unit, wherein each light/dark code includes a central element and one or more peripheral elements arranged around the central element, and the type of the light/dark code is identified by the arrangement of the peripheral elements.
  • In the pattern light used when measuring the three-dimensional shape of a measurement target with a single projection and photographing of the pattern, a plurality of light/dark codes, each consisting of a central element and peripheral elements arranged around it, are arranged, and the type of each light/dark code is identified by the arrangement of its peripheral elements. For this reason, compared with the case where graphic codes are used, the number of pixels required for one code can be reduced.
  • The pattern light projected by the projection unit preferably has, as a structural unit, a code group in which a predetermined number of the light/dark codes are arranged in a first direction, the structural unit being repeatedly arranged in a second direction perpendicular to the first direction.
  • Further, the pattern light projected by the projection unit preferably has, as a structural unit, a code group in which a predetermined number of the light/dark codes are arranged in a first direction parallel to the epipolar plane, the structural unit being repeatedly arranged in both the first direction and a second direction perpendicular to the epipolar plane.
  • Compared with the case where the pattern light is encoded in two directions with respect to the epipolar plane (directions parallel and perpendicular to it), as in a two-dimensional array pattern, encoding only in the direction parallel to the epipolar plane (the first direction) reduces the number of codes, and hence the number of bits necessary to represent all the codes. The maximum discriminable width within which the parallax deviation between the projection pattern and the imaging pattern can be distinguished can therefore be increased with a small number of bits. As a result, a long measurement range in the depth direction can be secured, and an increase in calculation cost can be suppressed.
  • According to the present invention, the three-dimensional shape of a measurement target can be measured with a single projection and photographing operation of the pattern light while ensuring spatial resolution.
  • In a spatial coding method in which a measurement target onto which pattern light is projected is photographed and a three-dimensional shape is measured based on the captured image, the present invention arranges in the pattern light light/dark codes, each consisting of a central element and peripheral elements arranged around it, and identifies the type of each code by the arrangement of its peripheral elements, thereby reducing the number of pixels required per code and ensuring spatial resolution.
  • FIG. 1 is a block diagram showing a configuration of a three-dimensional shape measuring apparatus according to an embodiment of the present invention. Note that the three-dimensional shape measuring apparatus shown in FIG. 1 is simplified for explaining the present invention.
  • The three-dimensional shape measurement apparatus according to the present invention is not limited to the configuration shown in FIG. 1; components necessary for measuring a three-dimensional shape (for example, a display device for confirming the photographed measurement target) may be provided separately.
  • the three-dimensional shape measurement apparatus according to the present embodiment is applied to, for example, a gesture motion input apparatus that is mounted on an automobile and receives an operation of a desired device by detecting the movement of a driver's hand or finger.
  • The three-dimensional shape measurement apparatus 10 includes a projection device 11 that projects pattern light onto the measurement target 20, an imaging device 12 that photographs the measurement target 20, and a processing device 13 that performs image processing and the like on the captured image captured by the imaging device 12.
  • the projection device 11 and the imaging device 12 constitute a projection unit and an imaging unit in the claims, respectively.
  • Projection device 11 is constituted by, for example, a projector.
  • the projection device 11 projects a predetermined pattern light onto the measurement target 20 under the control of the projection device control unit 131 of the processing device 13 described later.
  • the predetermined pattern light will be described later.
  • the imaging device 12 is configured with, for example, a camera.
  • the imaging device 12 images the measurement target 20 onto which a predetermined pattern light is projected.
  • The imaging device 12 outputs the image data (captured image) of the photographed measurement target 20 to the image capturing unit 132 of the processing device 13, described later.
  • the projection apparatus 11 and the imaging apparatus 12 fixed at predetermined positions are used. That is, the projection device 11 is arranged so that the pattern light can be projected at a predetermined position.
  • The imaging device 12 is arranged so that it can photograph a certain range including the position where the pattern light is projected by the projection device 11. It is assumed that the internal and external parameters of the projection device 11 and the imaging device 12 have been obtained in advance and that the two devices are time-synchronized.
  • The processing device 13 includes a projection device control unit 131 that controls the projection device 11, an image capturing unit 132 that captures image data from the imaging device 12, and a calculation unit 133 that performs the calculations necessary to measure the three-dimensional shape of the measurement target 20.
  • the calculation unit 133 includes a code extraction unit 133a, a code identification unit 133b, a corresponding point calculation unit 133c, and a three-dimensional position calculation unit 133d.
  • the code extracting unit 133a and the code identifying unit 133b constitute an extracting unit and an identifying unit in the claims, respectively, and the corresponding point calculating unit 133c and the three-dimensional position calculating unit 133d are calculated in the claims. Parts.
  • Projection device control unit 131 controls projection device 11.
  • the projection device control unit 131 controls the projection device 11 in response to an instruction from the control unit of the device main body on which the three-dimensional shape measurement device 10 is mounted.
  • For example, when the apparatus is mounted on an automobile, the projection device 11 is controlled in accordance with an instruction from an electronic control unit (ECU) that controls the entire vehicle.
  • Examples of the control target in the projection device control unit 131 include the type of pattern light projected on the measurement target 20 and the projection timing thereof.
  • the image capturing unit 132 captures image data (captured image) of the measurement target 20 input from the imaging device 12. In addition, the image capturing unit 132 outputs the captured image data to the arithmetic unit 133. Specifically, image data obtained by photographing the measurement target 20 onto which a predetermined pattern light is projected is captured, and the image data is output to the calculation unit 133. The calculation unit 133 performs a predetermined calculation process on the image data (captured image of the measurement target 20 onto which a predetermined pattern light is projected) input from the image capturing unit 132, thereby obtaining a three-dimensional shape of the measurement target 20. Perform necessary calculations to measure.
  • FIG. 2 is a diagram illustrating an example of pattern light used in the three-dimensional shape measurement apparatus 10 according to the present embodiment. Note that FIG. 2 shows a case where the pattern light is composed of visible light for convenience of explanation. In the pattern light shown in FIG. 2, the light projection position by the projection device 11 is shown in black, and the light non-projection position is shown in white. The same applies to the following drawings showing pattern light.
  • the pattern light used in the three-dimensional shape measurement apparatus 10 preferably has a wavelength other than visible light.
  • By using an invisible wavelength band such as infrared light for the pattern light, the three-dimensional shape can be measured without the measurement target 20 (e.g., a person) or persons in the vicinity being aware of it.
  • the pattern light shown in FIG. 2 includes a light / dark code in which the presence / absence of light projected at a predetermined position is coded. Specifically, in the pattern light shown in FIG. 2, a plurality of light and dark codes are arranged. These light and dark codes are arranged according to a predetermined rule. As will be described in detail later, the light and dark codes constituting the pattern light share a part with the adjacent light and dark codes and are configured continuously.
  • FIG. 3 is an explanatory diagram of the light / dark code 30 constituting the pattern light used in the three-dimensional shape measuring apparatus 10 according to the present embodiment.
  • FIG. 3A shows one light / dark code 30 constituting the pattern light used in the three-dimensional shape measuring apparatus 10 according to the present embodiment
  • FIG. 3B shows a basic pattern of the light / dark code 30.
  • FIG. 3 also shows a central element 31 of adjacent light and dark codes 30.
  • Each light/dark code 30 consists of a central element 31 and a plurality of arms 32 (32a to 32d; four in this embodiment) arranged around the central element 31.
  • the central element 31 has a circular shape
  • the arm 32 has a rectangular shape.
  • the shapes of the central element 31 and the arm 32 are not limited to these and can be changed as appropriate. In the following description, it is assumed that the center element 31 has a circular shape and the arm 32 has a rectangular shape.
  • the arms 32a to 32d are continuously arranged at predetermined positions on the outer edge of the central element 31 having a circular shape.
  • the arms 32a to 32d are arranged at equal intervals (specifically, 90 ° intervals) from the outer edge of the central element 31.
  • With reference to a horizontal plane passing through the center point of the central element 31, the arm 32a extends from the center point obliquely upward to the right in FIG. 3A, the arm 32b extends obliquely downward to the right, the arm 32c extends obliquely downward to the left, and the arm 32d extends obliquely upward to the left.
  • The tips of the arms 32a to 32d are connected to the central elements 31 of the adjacent light/dark codes 30.
  • Specifically, the arm 32a is connected to the central element 31 of the light/dark code 30 located obliquely upward to the right in FIG. 3A, and the arm 32b is connected to the central element 31 of the light/dark code 30 located obliquely downward to the right in FIG. 3A.
  • Likewise, the arm 32c is connected to the central element 31 of the light/dark code 30 located obliquely downward to the left in FIG. 3A, and the arm 32d is connected to the central element 31 of the light/dark code 30 located obliquely upward to the left in FIG. 3A.
  • In FIG. 3, the central element 31 of an adjacent light/dark code 30 is indicated as “central element 31′”.
  • The light/dark code 30 is a 4-bit code that can express 16 types of information depending on the presence or absence of the four arms 32a to 32d connected to the central element 31 (see FIG. 3B).
  • the light / dark code 30a in which all the four arms 32a to 32d exist is specified by the bit information of “1111”.
  • the light / dark code 30b in which all of the four arms 32a to 32d do not exist is specified by bit information “0000”, and the light / dark code 30c in which only the arms 32a and 32c exist is specified by bit information “0101”.
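As a minimal sketch of this 4-bit scheme, the code below renders one light/dark code 30 into a small binary image. The exact pixel layout is an illustrative assumption (the embodiment only states that a code occupies roughly 6 × 6 pixels, as discussed with FIG. 4), and the mapping from bit position to arm is likewise assumed; it is chosen to be consistent with the examples above, in which “1111” has all four arms and “0101” has only the arms 32a and 32c.

```python
import numpy as np

def render_code(bits: str) -> np.ndarray:
    """Render one light/dark code 30 as a 6 x 6 binary image (1 = light).

    The pixel layout is an illustrative assumption, and so is the
    bit-position-to-arm mapping (32d, 32a, 32b, 32c), chosen so that
    "1111" draws all four arms and "0101" draws only 32a and 32c.
    """
    img = np.zeros((6, 6), dtype=np.uint8)
    img[2:4, 2:4] = 1  # central element 31, approximated by a 2 x 2 block
    # (corner pixel of the central block, outward diagonal direction)
    arms = [((2, 2), (-1, -1)),  # arm 32d, toward the upper left
            ((2, 3), (-1, 1)),   # arm 32a, toward the upper right
            ((3, 3), (1, 1)),    # arm 32b, toward the lower right
            ((3, 2), (1, -1))]   # arm 32c, toward the lower left
    for bit, ((cy, cx), (dy, dx)) in zip(bits, arms):
        if bit == "1":
            for step in (1, 2):  # the arm runs to the image corner,
                img[cy + dy * step, cx + dx * step] = 1  # meeting the neighbour code
    return img
```

Because the arms run diagonally to the image corners, rendering adjacent codes side by side makes each arm tip touch the neighbouring central element, as described above.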
  • Such light/dark codes 30 are continuously arranged in the X-axis and Y-axis directions shown in FIG. 2.
  • the light / dark code 301 shown in FIG. 2 is continuous with the light / dark codes 302 to 305 arranged in the periphery.
  • the light / dark code 302 is disposed on the upper right side of the light / dark code 301
  • the light / dark code 303 is disposed on the lower right side of the light / dark code 301.
  • the light / dark code 304 is disposed on the lower left side of the light / dark code 301
  • the light / dark code 305 is disposed on the upper left side of the light / dark code 301.
  • The arm 32a of the light/dark code 301 also serves as the arm 32c of the light/dark code 302, and the arm 32b of the light/dark code 301 also serves as the arm 32d of the light/dark code 303.
  • Likewise, the arm 32c of the light/dark code 301 also serves as the arm 32a of the light/dark code 304, and the arm 32d of the light/dark code 301 also serves as the arm 32b of the light/dark code 305. That is, the arms 32a to 32d of the light/dark code 301 are shared with part of the arms 32 of the surrounding light/dark codes 302 to 305.
  • the density of the light / dark code 30 arranged in the pattern light can be increased.
  • In the pattern light shown in FIG. 2, a code group 300 in which a predetermined number (15 in the present embodiment) of light/dark codes 30 are arranged in the X-axis direction constitutes a structural unit, and this code group 300 is repeatedly arranged in the X-axis direction. In the code group 300 shown in FIG. 2, 15 mutually different light/dark codes 30 are arranged in a predetermined order.
  • Specifically, the light/dark code 30 specified by the bit information “1001” is arranged on the rightmost side. Toward the left follow the light/dark codes 30 specified by the bit information “0100”, “1100”, “1101”, “0111”, “1101”, “1000”, “0110”, “0010”, “0001”, “1111”, “1110”, “0101”, “0011”, and the light/dark code 30 specified by the bit information “1010” is arranged on the leftmost side.
  • The width of the code group 300 constitutes the discriminable width within which a code that has moved in the X-axis direction after projection can still be distinguished from the same code in the adjacent code group 300, i.e., the width within which the parallax deviation between the projection pattern and the imaging pattern can be distinguished. That is, the parallax deviation after projection of the pattern light can be specified within the range of the width of the code group 300.
  • a pattern in which such code groups 300 are repeatedly arranged in the X-axis direction is repeatedly arranged in the Y-axis direction. That is, the pattern light shown in FIG. 2 is configured by pattern light in which the code group 300 is repeatedly arranged in both the X-axis and Y-axis directions. In other words, the pattern light shown in FIG. 2 is encoded only in a certain range in the X-axis direction (a range corresponding to 15 light / dark codes 30), and is not encoded in the Y-axis direction. In the three-dimensional shape measurement apparatus 10, the pattern light shown in FIG. 2 is projected from the projection apparatus 11 onto the measurement object 20, and the measurement object 20 onto which the pattern light is projected is imaged by the imaging apparatus 12.
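The arrangement just described (a group of 15 mutually distinct 4-bit codes tiled in the X direction and repeated unchanged in the Y direction) can be sketched as follows. The concrete 15-code sequence below is a hypothetical stand-in, not the patent's exact order (the sequence printed above contains an apparent transcription duplicate of “1101”); it is chosen only so that every code is unique within the group and “1111” sits fifth from the left, consistent with the identification example given later.

```python
# Hypothetical stand-in for the 15-code group 300 (left to right):
# every 4-bit code is unique within the group, and "1111" is fifth
# from the left as in the embodiment's identification example.
GROUP = ["1010", "0011", "0101", "1110", "1111", "0001", "0010", "0110",
         "1000", "1011", "0111", "1101", "1100", "0100", "1001"]

def build_pattern(cols: int, rows: int) -> list:
    """Tile the code group in the X direction and repeat each row
    unchanged in the Y direction (the pattern is coded only in X)."""
    row = [GROUP[i % len(GROUP)] for i in range(cols)]
    return [list(row) for _ in range(rows)]

def position_in_group(code: str) -> int:
    """Because each code is unique within one group, its index gives
    its X position modulo the group width; the parallax deviation is
    therefore resolvable within one group width."""
    return GROUP.index(code)
```

The absence of Y-direction coding is what keeps the number of distinct codes, and hence the bits per code, small.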
  • FIG. 4 is a schematic diagram for explaining the relationship between the light / dark code 30 and the number of pixels used in the three-dimensional shape measuring apparatus 10 according to the present embodiment.
  • The light/dark code 30 according to the present embodiment can be represented by, for example, a total of 36 pixels: 6 pixels in the vertical direction by 6 pixels in the horizontal direction.
  • Since the type of the light/dark code 30 according to the present embodiment is detected based on the light and darkness of these pixels, the code can be detected appropriately even when its outline is not clearly identifiable because of the small number of pixels. As a result, the number of codes that can be detected per unit area by the image sensor can be increased, and the spatial resolution can be improved.
  • In contrast, when detecting a pattern in which graphic codes are arranged in a two-dimensional series, each code is identified by matching a graphic code (for example, an elliptical graphic code as in Patent Document 1 described in the Background Art), so that, for example, curves included in the graphic code must be represented adequately. This places a certain lower limit on the number of pixels representing a graphic code. Consequently, the number of graphic codes that can be detected per unit area by the image sensor cannot be increased, and it is difficult to improve the spatial resolution.
  • As described above, the calculation unit 133 includes the code extraction unit 133a, the code identification unit 133b, the corresponding point calculation unit 133c, and the three-dimensional position calculation unit 133d, and performs the calculations necessary for measuring the three-dimensional shape of the measurement target 20.
  • the code extraction unit 133a extracts the central element 31 of the light / dark code 30 from the image data input from the image capturing unit 132 (a captured image of the measurement target 20 onto which a predetermined pattern light is projected). For example, the code extraction unit 133a extracts center point information (center coordinate information) of the center element 31 of the light / dark code 30 based on luminance information in the captured image. More specifically, the code extraction unit 133a extracts center point information (center coordinate information) of the center element 31 of the light / dark code 30 from the maximum luminance point in the captured image. For example, the code extraction unit 133a acquires the position information of the corresponding light / dark code 30 from the central point information of the central element 31 of the extracted light / dark code 30. The code extraction unit 133a outputs the acquired position information of the light / dark code 30 to the code identification unit 133b.
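The extraction step above (taking luminance maxima in the captured image as the central elements 31) might be sketched as follows. The 8-neighbour local-maximum test and the fixed threshold are illustrative assumptions, not details from the embodiment; a real implementation would typically add sub-pixel refinement of the centre coordinates.

```python
import numpy as np

def extract_centers(img: np.ndarray, threshold: float) -> list:
    """Pick (y, x) candidates for the central elements 31 as local
    luminance maxima, loosely mirroring code extraction unit 133a."""
    centers = []
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = img[y, x]
            # keep pixels brighter than the threshold that are also the
            # brightest pixel of their 3 x 3 neighbourhood
            if v > threshold and v >= img[y - 1:y + 2, x - 1:x + 2].max():
                centers.append((y, x))
    return centers
```

Working from the centre positions alone means the full outline of each code never has to be recognized, which is the cost saving the text describes.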
  • The code identification unit 133b identifies, from the image data input from the image capturing unit 132, the type of the light/dark code 30 based on the position information input from the code extraction unit 133a. Specifically, the type of the light/dark code 30 is identified by the presence or absence of the arms 32a to 32d arranged around the central element 31. The code identification unit 133b also specifies the position of the light/dark code 30 to be identified within the code group 300. For example, it identifies this position by comparison with the arrangement of the light/dark codes 30 of the code group 300 held in advance.
  • For example, when all four arms 32a to 32d are detected around the central element 31, the code identification unit 133b identifies the light/dark code 30 to be identified as the light/dark code 30a specified by the bit information “1111” shown in FIG. 3A. In this case, the code identification unit 133b identifies the position of the light/dark code 30 to be identified as the fifth position from the left in the code group 300. The code identification unit 133b outputs the identified type of the light/dark code 30 and its position in the code group 300 to the corresponding point calculation unit 133c.
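The identification step can be sketched as sampling the four diagonal arm positions around a detected centre. The sampling offset and threshold are illustrative tuning values, and the bit-position-to-arm mapping (32d, 32a, 32b, 32c) is an assumption chosen to be consistent with the examples “1111” (all arms) and “0101” (arms 32a and 32c only) given earlier.

```python
import numpy as np

def identify_code(img: np.ndarray, center: tuple,
                  offset: int = 2, threshold: float = 0.5) -> str:
    """Classify a light/dark code 30 by sampling the four diagonal arm
    positions around its central element 31 (sketch of unit 133b).

    offset and threshold are illustrative; the arm order (32d, 32a,
    32b, 32c) is an assumed bit convention, not stated in the patent.
    """
    y, x = center
    dirs = [(-1, -1),  # toward arm 32d (upper left)
            (-1, 1),   # toward arm 32a (upper right)
            (1, 1),    # toward arm 32b (lower right)
            (1, -1)]   # toward arm 32c (lower left)
    bits = ""
    for dy, dx in dirs:
        bits += "1" if img[y + dy * offset, x + dx * offset] > threshold else "0"
    return bits
```

Looking up the returned bit string in the known code-group sequence then yields the code's position within the group.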
  • The corresponding point calculation unit 133c calculates the corresponding point, in the projection image projected by the projection device 11, of the light/dark code 30 to be calculated, based on the type of the light/dark code 30 and its position in the code group 300 input from the code identification unit 133b. Specifically, the corresponding point is calculated by comparing the position of the corresponding light/dark code 30 in the code group 300 in the projection image with the position of the light/dark code 30 obtained from the code identification unit 133b. The corresponding point calculation unit 133c outputs information about the calculated corresponding point (hereinafter, “corresponding point information”) to the three-dimensional position calculation unit 133d.
  • The three-dimensional position calculation unit 133d calculates the three-dimensional position corresponding to the light/dark code 30 to be calculated, based on the corresponding point information input from the corresponding point calculation unit 133c. Specifically, the three-dimensional position calculation unit 133d acquires the depth coordinate from the positional deviation before and after projection (deviation due to parallax) of the central element 31 of the corresponding light/dark code 30, and calculates the three-dimensional position corresponding to the light/dark code 30 from this depth coordinate. The three-dimensional position calculation unit 133d outputs the calculated three-dimensional position to the control unit of the apparatus main body on which the three-dimensional shape measurement apparatus 10 is mounted. By calculating the three-dimensional positions of all the light/dark codes 30 projected onto the measurement target 20, the three-dimensional shape of the measurement target 20 can be measured.
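The depth recovery in this step rests on triangulation along the epipolar direction. As a rough sketch under a standard rectified-geometry assumption, depth can be taken as Z = f * b / d, where d is the parallax shift of the code's central element; note this textbook form is a stand-in, since the embodiment instead measures the shift relative to a virtual reference plane Zmin.

```python
def depth_from_disparity(x_projected: float, x_observed: float,
                         focal_px: float, baseline: float) -> float:
    """Triangulate the depth of a central element 31 from its parallax
    shift along the epipolar (X) direction.

    Z = f * b / d, with focal_px the focal length in pixels and
    baseline the projector-camera distance; both are hypothetical
    calibration values, assumed known in advance (see the internal and
    external parameters mentioned earlier).
    """
    disparity = x_projected - x_observed
    if disparity == 0:
        raise ValueError("zero disparity: the point is at infinity")
    return focal_px * baseline / disparity
```

Applying this to every identified centre yields the per-code three-dimensional positions from which the shape of the measurement target is assembled.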
  • FIG. 5 is an explanatory diagram of depth information regarding an arbitrary point P of the measurement target 20 in the three-dimensional shape measurement apparatus 10 according to the present embodiment.
  • FIG. 5A shows the positional relationship between the measurement target 20 and the projection device 11 and imaging device 12 in the three-dimensional shape measurement device 10 according to the present embodiment.
  • FIG. 5B schematically shows a light receiving surface of a CCD (Charge Coupled Device) image sensor 121 included in the photographing apparatus 12.
  • The imaging device 12 is disposed at a position on the opposite side of the projection device 11 across the position at which the projection device 11 projects the pattern light onto the measurement target 20.
  • The X axis (see FIG. 2) of the pattern light projected by the projection device 11 is arranged parallel to the epipolar plane EP formed by three points: the principal point C1 of the projection device 11, the principal point C2 of the imaging device 12, and an arbitrary point P on the measurement target 20.
  • The imaging device 12 includes a CCD image sensor 121 in which the imaging elements are arranged parallel to the X-axis and Y-axis directions of the pattern light shown in FIG. 2.
  • The X axis of the CCD image sensor 121 shown in FIG. 5B is arranged parallel to the X axis of the pattern light shown in FIG. 2, and the Y axis of the image sensor shown in FIG. 5B is arranged parallel to the Y axis of the pattern light shown in FIG. 2.
  • The imaging elements on the X axis of the imaging device 12 are arranged parallel to the epipolar plane EP formed by three points: the principal point C1 of the projection device 11, the principal point C2 of the imaging device 12, and an arbitrary point P on the measurement target 20.
  • The shift between the reflected position Xmin and the reflected position Xa is acquired as depth information at the point P (the depth distance from the virtual reference plane Zmin).
  • The line (epipolar line) EL connecting the reflected position Xmin and the reflected position Xa, used when obtaining depth information regarding an arbitrary point P on the measurement target 20, is, as shown in FIG. 5B, a straight line parallel to the X axis: it changes only in the X-axis direction and does not change in the Y-axis direction.
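Because the epipolar line is parallel to the sensor's X axis, the search for a corresponding point reduces to a one-dimensional scan along a single sensor row. The following is a minimal sketch; the correlation score and parameter names are illustrative and not taken from the specification:

```python
def find_shift_along_row(row_pixels, x_min, template, max_shift):
    """Scan one sensor row (the epipolar line EL) starting at the reference
    position x_min, returning the position x_a where a small intensity
    template best matches; the parallax shift is then x_a - x_min."""
    best, best_score = x_min, -1.0
    last = min(x_min + max_shift, len(row_pixels) - len(template))
    for x in range(x_min, last + 1):
        window = row_pixels[x:x + len(template)]
        score = sum(a * b for a, b in zip(window, template))  # correlation-like score
        if score > best_score:
            best, best_score = x, score
    return best  # x_a
```

The point of the design is that no two-dimensional search is needed: the Y coordinate of the corresponding point is fixed by the row itself.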
  • FIG. 6 is a flowchart for explaining the operation of the three-dimensional shape measuring apparatus 10 according to the present embodiment.
  • FIG. 6 shows the operation of the processing device 13 of the three-dimensional shape measuring apparatus 10.
  • In FIG. 6, it is assumed that three-dimensional shape measurement has been instructed by the control unit of the apparatus main body on which the three-dimensional shape measurement apparatus 10 according to the present embodiment is mounted.
  • the captured image captured by the image capturing unit 132 is output to the computing unit 133.
  • the code extraction unit 133a extracts the center point information (center coordinate information) of the center element 31 of the light / dark code 30 included in the captured image (ST602).
  • the code extraction unit 133a acquires the position information of the corresponding light / dark code 30 from the center point information of the central element 31 of the extracted light / dark code 30.
  • the position information of the light / dark code 30 extracted by the code extraction unit 133a is output to the code identification unit 133b.
  • The central element 31 of the light/dark code 30 is extracted by the code extraction unit 133a, and the position information of the light/dark code 30 is acquired from the center point information of the central element 31.
  • Since the position information of the light/dark code 30 is acquired from the position of the central element 31, the shape of the entire light/dark code 30 need not be recognized and the corresponding processing can be omitted, further suppressing an increase in the calculation cost required to detect and identify the code.
  • Upon receiving the position information of the light/dark code 30, the code identification unit 133b identifies the type of the light/dark code 30 (ST603).
  • the code identification unit 133b identifies the type of the light / dark code 30 based on the presence / absence of the arms 32a to 32d arranged around the central element 31. In this case, the code identifying unit 133b identifies the position in the code group 300 of the light / dark code 30 to be identified.
  • the type of the light / dark code 30 and the position of the light / dark code 30 in the code group 300 are output to the corresponding point calculation unit 133c.
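The identification by presence or absence of the arms 32a to 32d amounts to reading a 4-bit value per code. The following is a minimal sketch, assuming an illustrative bit ordering of the arms and that each code type occurs at most once per code group (both assumptions are not stated in the specification):

```python
ARM_ORDER = ("32a", "32b", "32c", "32d")  # hypothetical bit ordering

def code_type(arms_present):
    """Map the set of detected arms around a central element 31 to a
    4-bit code type (0-15): bit i is set when arm ARM_ORDER[i] is present."""
    value = 0
    for bit, arm in enumerate(ARM_ORDER):
        if arm in arms_present:
            value |= 1 << bit
    return value

def position_in_group(group_sequence, value):
    """Position of a code type within the known code group 300, assuming
    each type occurs at most once per group."""
    return group_sequence.index(value)
```

For example, a code with only arms 32a and 32c present would decode to type 5 (binary 0101) under this ordering.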
  • Upon receiving information such as the position of the light/dark code 30 in the code group 300, the corresponding point calculation unit 133c calculates the corresponding point, in the projection image projected by the projection device 131, of the light/dark code 30 to be calculated (ST604). In this case, the corresponding point is calculated by comparing the position in the code group 300 of the corresponding light/dark code 30 in the projection image with the position of the light/dark code 30 reported by the code identification unit 133b. The corresponding point information calculated by the corresponding point calculation unit 133c is then output to the three-dimensional position calculation unit 133d.
  • the three-dimensional position calculation unit 133d calculates a three-dimensional position corresponding to the light / dark code 30 to be calculated (ST605).
  • Specifically, the three-dimensional position calculation unit 133d acquires the depth coordinate from the deviation between the reflected position on the virtual reference plane and the reflected position at the actual position for the central element 31 of the corresponding light/dark code 30, and calculates from the depth coordinate the three-dimensional position corresponding to the light/dark code 30 to be calculated.
  • the three-dimensional position calculated by the three-dimensional position calculation unit 133d is output to the control unit of the apparatus main body on which the three-dimensional shape measurement apparatus 10 is mounted.
  • The calculation unit 133 determines whether the processing has been completed for the entire captured image (that is, for all the light/dark codes 30) (ST606). If the processing has not been completed for the entire captured image, the calculation unit 133 returns the processing to ST602 and repeats ST602 to ST606. On the other hand, when the processing has been completed for the entire captured image, the calculation unit 133 ends the processing.
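The loop of steps ST602 to ST606 can be sketched as follows; the callables stand in for the code extraction unit 133a, the code identification unit 133b, the corresponding point calculation unit 133c, and the three-dimensional position calculation unit 133d, and their signatures are illustrative rather than from the specification:

```python
def measure_shape(captured_image, extract, identify, match, triangulate):
    """Process every light/dark code in the captured image (ST602-ST606)."""
    points_3d = []
    for centre in extract(captured_image):             # ST602: centre of each central element 31
        kind, pos_in_group = identify(centre)           # ST603: code type and position in group 300
        corresponding = match(kind, pos_in_group)       # ST604: corresponding point in the projection
        points_3d.append(triangulate(centre, corresponding))  # ST605: 3-D position
    return points_3d                                    # ST606: done once all codes are processed
```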
  • a plurality of light / dark codes 30 including arms 32 arranged around the central element 31 are arranged, and the type of the light / dark code 30 is identified by the arrangement of the arms 32.
  • Compared with the case where graphic codes are used, the number of pixels required per code for detection and identification can be reduced.
  • the number of codes (code density) that can be detected per unit area by the image sensor can be increased, and the spatial resolution can be improved.
  • The pattern light projected onto the measurement target 20 uses as its constituent unit a code group 300 in which a predetermined number of the light/dark codes 30 are arranged in a direction parallel to the epipolar plane (the X-axis direction shown in FIG. 2), and the constituent unit is repeatedly arranged.
  • Since the pattern light is encoded only in the direction parallel to the epipolar plane, the number of codes can be reduced compared with a two-dimensional array pattern (such as a two-dimensional random point cloud pattern) encoded in two orthogonal directions (the directions parallel and perpendicular to the epipolar plane), so the number of bits necessary to represent all the codes can be reduced.
  • Accordingly, when a code moves in the direction parallel to the epipolar plane (the first direction) after projection, the width over which it can be distinguished from the same code in the adjacent code group (the width over which the parallax shift between the projection pattern and the captured pattern can be discriminated) can be made long with a small number of bits. As a result, a long measurement range in the depth direction can be secured, and an increase in calculation cost can be suppressed.
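The relationship between bits per code, codes per group, and the discriminable width can be made concrete with a small calculation. This is a sketch under two illustrative assumptions not stated in the specification: each code type appears once per group, and codes are laid out at a fixed pixel pitch.

```python
def discriminable_width_px(bits_per_code, codes_per_group, code_pitch_px):
    """Largest parallax shift (in pixels) that can be disambiguated before a
    shifted code coincides with the same code type in the adjacent group."""
    if codes_per_group > 2 ** bits_per_code:
        raise ValueError("not enough distinct code types for this group size")
    return codes_per_group * code_pitch_px

# With 4-bit codes, a group of up to 16 distinct codes is possible; at a
# pitch of 8 px this allows shifts of up to 128 px to be disambiguated,
# whereas a 3-bit code caps the group at 8 codes (64 px at the same pitch).
```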
  • In the pattern light projected onto the measurement target 20, a common arm 32 is disposed between the central elements 31 of adjacent light/dark codes 30, and the adjacent light/dark codes 30 are continuously connected.
  • Since the arm 32 is shared between adjacent light/dark codes 30, the density of the light/dark codes 30 arranged in the pattern light can be increased.
  • the number of codes that can be detected by the CCD image sensor 121 of the photographing apparatus 12 can be increased, and the spatial resolution can be further improved.
  • In the light/dark code 30, the arms 32a to 32d are arranged so that their extending directions, viewed from the central element 31, are oblique to the arrangement direction of the light/dark codes 30 (the X-axis direction shown in FIG. 2).
  • The length of the arms 32 can be secured compared with the case where the extending direction of the arms 32 is parallel or perpendicular to the arrangement direction of the light/dark codes 30. This makes it possible to improve the discriminability of the position of the central element 31 in the light/dark code 30 and of the presence or absence of the arms 32a to 32d.
  • The present invention is not limited to the above embodiment and can be implemented with various modifications.
  • The configurations and the like illustrated in the accompanying drawings are not limiting and can be changed as appropriate within a range in which the effect of the present invention is exhibited.
  • various modifications can be made without departing from the scope of the object of the present invention.
  • each light / dark code 30 is constituted by a 4-bit code.
  • the configuration of each light / dark code 30 is not limited to this, and can be changed as appropriate.
  • the light / dark code 30 may be composed of a 3-bit code capable of expressing eight types of information or an 8-bit code capable of expressing 256 types of information according to the amount of information to be expressed.
  • the number of light / dark codes 30 constituting the code group 300 can be adjusted as appropriate according to the amount of information that the light / dark code 30 can express (that is, the distinguishable width can be adjusted).
  • the case where the light / dark code 30 included in the pattern light includes a plurality of arms 32 as peripheral elements has been described.
  • the configuration of the peripheral elements of the light / dark code 30 is not limited to this and can be appropriately changed.
  • In the above embodiment, the case where the light/dark code 30 included in the pattern light has the circular central element 31 has been described.
  • The configuration of the central element 31 of the light/dark code 30 is not limited to this and can be changed as appropriate.
  • FIG. 7 is an explanatory diagram of a light / dark code 30 according to a modification of the present embodiment.
  • FIG. 7A shows a mode in which the light/dark code 30 is not connected to the central elements 31 of the adjacent light/dark codes 30.
  • FIGS. 7B and 7C show an aspect in which the central element constituting the light/dark code 30 and the peripheral elements arranged around the central element have the same shape.
  • FIG. 7B shows a case where both the central element and the peripheral elements have a circular shape.
  • FIG. 7C shows a case where both the central element and the peripheral elements have a square shape.
  • FIG. 7D shows a case where eight peripheral elements are arranged around the central element 31, that is, an 8-bit light/dark code 30. Even when the form of the light/dark code 30 is modified in these ways, as in the above embodiment, the three-dimensional shape of the measurement target can be measured with a single projection of pattern light and a single imaging operation while ensuring spatial resolution.
  • In the above embodiment, a three-dimensional shape measurement method that measures the instantaneous three-dimensional shape with a single imaging operation has been described.
  • Dynamic measurement (motion input) of a three-dimensional shape is also possible by measuring the three-dimensional shape at predetermined time intervals using such a three-dimensional shape measurement method.
  • the arrangement direction of the light / dark code 30 is parallel to the epipolar plane.
  • the arrangement direction of the light / dark code 30 is not limited to this and can be changed as appropriate.
  • The light/dark codes 30 may be arranged along a direction slightly inclined with respect to the epipolar plane. Even with such a change, the same effects as in the above embodiment can be obtained.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

[Problem] To measure the three-dimensional shape of an object to be measured with a single projection of pattern light and a single imaging operation while maintaining spatial resolution. [Solution] A three-dimensional shape measurement device (10) in which: pattern light in which a plurality of light-dark codes are arranged is projected onto an object to be measured (20) from a projection device (11); an image of the object to be measured with the pattern light projected thereon is captured by an imaging device (12); the light-dark codes are extracted by a processing device (13) from the captured image of the object to be measured; the types of the extracted light-dark codes are identified; and a three-dimensional position is calculated from position information for each of the identified light-dark codes. The three-dimensional shape measurement device (10) is characterized in that the light-dark codes comprise a central element and one or more peripheral elements arranged at the periphery of the central element, and the types of the light-dark codes are identified from the arrangement of the peripheral elements.

Description

Three-dimensional shape measurement method and three-dimensional shape measurement device
The present invention relates to a three-dimensional shape measurement method and a three-dimensional shape measurement apparatus for photographing, with an imaging apparatus such as a camera, a measurement target onto which pattern light has been projected from a projection apparatus such as a projector, and measuring the three-dimensional shape of the measurement target by image processing.
Conventionally, a spatial coding method is known as a method for measuring the three-dimensional shape of a measurement target with a single projection and imaging of a pattern. For example, with this spatial coding method, a three-dimensional shape measurement method has been proposed in which a pattern of circular black-and-white graphic codes arranged in a two-dimensional series is projected onto a measurement target, and the three-dimensional shape is reconstructed by photographing the measurement target (see, for example, Patent Document 1).
JP 2011-237296 A
However, since the three-dimensional shape measurement method described in Patent Document 1 uses a pattern in which graphic codes are arranged in a two-dimensional series, a large number of pixels is required per code for detection and identification. As a result, the number of codes that can be detected per unit area by the image sensor cannot be increased, and it is difficult to improve the spatial resolution.
The present invention has been made in view of this point, and an object thereof is to provide a three-dimensional shape measurement method and a three-dimensional shape measurement apparatus capable of measuring the three-dimensional shape of a measurement target with a single projection of pattern light and a single imaging operation while ensuring spatial resolution.
The three-dimensional shape measurement method of the present invention comprises: a projection step of projecting pattern light in which a plurality of light/dark codes are arranged onto a measurement target; an imaging step of photographing the measurement target onto which the pattern light is projected; a code extraction step of extracting the light/dark codes from a captured image of the measurement target; a code identification step of identifying the type of each extracted light/dark code; and a three-dimensional position calculation step of calculating a three-dimensional position from the position information of each identified light/dark code, wherein each light/dark code consists of a central element and one or more peripheral elements arranged around the central element, and the type of the light/dark code is identified by the arrangement of the peripheral elements.
According to the above three-dimensional shape measurement method, a plurality of light/dark codes, each consisting of a central element and peripheral elements arranged around it, are arranged in the pattern light used to measure the three-dimensional shape of the measurement target with a single projection and imaging of the pattern, and the type of each light/dark code is identified by the arrangement of its peripheral elements. Compared with the case where graphic codes are used, the number of pixels required per code for detection and identification can therefore be reduced. This increases the number of codes that can be detected per unit area by the image sensor, improving the spatial resolution. As a result, the three-dimensional shape of the measurement target can be measured with a single projection of pattern light and a single imaging operation while ensuring spatial resolution.
For example, in the above three-dimensional shape measurement method, the pattern light is preferably a pattern in which a code group formed by arranging a predetermined number of light/dark codes in a first direction is used as a constituent unit, and the constituent unit is repeatedly arranged in a second direction perpendicular to the first direction. In particular, the pattern light is preferably a pattern in which a code group formed by arranging a predetermined number of light/dark codes in a first direction parallel to the epipolar plane is used as a constituent unit, and the constituent unit is repeatedly arranged in a second direction perpendicular to the first direction and the epipolar plane. In this case, since the pattern light is encoded only in the direction parallel to the epipolar plane (the first direction), the number of codes can be reduced compared with a two-dimensional array pattern encoded in two orthogonal directions (the directions parallel and perpendicular to the epipolar plane), so the number of bits necessary to represent all the codes can be reduced. Accordingly, when a code moves in the direction parallel to the epipolar plane (the first direction) after projection, the width over which it can be distinguished from the same code in the adjacent code group (the width over which the parallax shift between the projection pattern and the captured pattern can be discriminated) can be made long with a small number of bits. This secures a long measurement range in the depth direction and suppresses an increase in calculation cost.
In particular, in the above three-dimensional shape measurement method, the pattern light preferably has a common peripheral element disposed between the central elements of adjacent light/dark codes, with the adjacent light/dark codes arranged so as to be continuously connected. In this case, since peripheral elements are shared by adjacent light/dark codes, the density of the light/dark codes arranged in the pattern light can be increased. This increases the number of codes that can be detected by the image sensor, further improving the spatial resolution.
In the above three-dimensional shape measurement method, it is also preferable that, in the code extraction step, the central element of the light/dark code is extracted, and the position information of the light/dark code is acquired from the position of the central element. In this case, since the position information of the light/dark code is acquired from the position of the central element, the shape of the entire light/dark code need not be recognized and the corresponding processing can be omitted, further suppressing an increase in calculation cost.
For example, in the above three-dimensional shape measurement method, the light/dark code has four positions around the central element at which peripheral elements can be arranged. In this case, since four peripheral elements can be arranged around the central element, a 4-bit code can be realized with one light/dark code depending on the presence or absence of these peripheral elements.
In the above three-dimensional shape measurement method, the peripheral elements are preferably arranged so that their directions, viewed from the central element, are oblique to the arrangement direction of the light/dark codes. In this case, the length of the peripheral elements can be secured compared with the case where they are arranged parallel or perpendicular to the arrangement direction of the light/dark codes. This improves the discriminability of the presence or absence of peripheral elements in the light/dark code.
For example, in the above three-dimensional shape measurement method, the pattern light projected in the projection step has a wavelength outside the visible range. By using an invisible wavelength band such as infrared light for the pattern light in this way, the three-dimensional shape can be measured without the person being measured, or people nearby, being aware of it. This makes it possible to measure the three-dimensional shape without hindering the actions of the person being measured (for example, the driving operation of a vehicle).
The three-dimensional shape measurement apparatus of the present invention comprises: a projection unit that projects pattern light in which a plurality of light/dark codes are arranged onto a measurement target; an imaging unit that photographs the measurement target onto which the pattern light is projected; an extraction unit that extracts the light/dark codes from a captured image of the measurement target; an identification unit that identifies the type of each light/dark code extracted by the extraction unit; and a calculation unit that calculates a three-dimensional position from the position information of each light/dark code identified by the identification unit, wherein each light/dark code consists of a central element and one or more peripheral elements arranged around the central element, and the type of the light/dark code is identified by the arrangement of the peripheral elements.
According to the above three-dimensional shape measurement apparatus, a plurality of light/dark codes, each consisting of a central element and peripheral elements arranged around it, are arranged in the pattern light used to measure the three-dimensional shape of the measurement target with a single projection and imaging of the pattern, and the type of each light/dark code is identified by the arrangement of its peripheral elements. Compared with the case where graphic codes are used, the number of pixels required per code for detection and identification can therefore be reduced. This increases the number of codes that can be detected by the image sensor, improving the spatial resolution. As a result, the three-dimensional shape of the measurement target can be measured with a single projection of pattern light and a single imaging operation while ensuring spatial resolution.
For example, in the above three-dimensional shape measurement apparatus, the pattern light projected by the projection unit is preferably a pattern in which a code group formed by arranging a predetermined number of light/dark codes in a first direction is used as a constituent unit, and the constituent unit is repeatedly arranged in a second direction perpendicular to the first direction. In particular, the pattern light projected by the projection unit is preferably a pattern in which a code group formed by arranging a predetermined number of light/dark codes in a first direction parallel to the epipolar plane is used as a constituent unit, and the constituent unit is repeatedly arranged in a second direction perpendicular to the first direction and the epipolar plane. In this case, unlike a two-dimensional array pattern encoded in two orthogonal directions (the directions parallel and perpendicular to the epipolar plane), the pattern light is encoded only in the direction parallel to the epipolar plane (the first direction), so the maximum width over which the parallax shift between the projection pattern and the captured pattern can be discriminated can be made long with a small number of bits. This secures a long measurement range in the depth direction and suppresses an increase in calculation cost.
According to the present invention, the three-dimensional shape of the measurement target can be measured with a single projection of pattern light and a single imaging operation while ensuring spatial resolution.
FIG. 1 is a block diagram showing the configuration of the three-dimensional shape measurement apparatus according to the present embodiment. FIG. 2 is a diagram showing an example of the pattern light used in the three-dimensional shape measurement apparatus according to the present embodiment. FIG. 3 is an explanatory diagram of the light/dark codes constituting the pattern light used in the three-dimensional shape measurement apparatus according to the present embodiment. FIG. 4 is an explanatory diagram of the relationship between the light/dark code and the number of pixels in the three-dimensional shape measurement apparatus according to the present embodiment. FIG. 5 is an explanatory diagram of depth information regarding an arbitrary point P of the measurement target in the three-dimensional shape measurement apparatus according to the present embodiment. FIG. 6 is a flowchart for explaining the operation of the three-dimensional shape measurement apparatus according to the present embodiment. FIG. 7 is an explanatory diagram of a light/dark code according to a modification of the present embodiment.
 Hereinafter, an embodiment of the present invention will be described in detail with reference to the accompanying drawings. The present invention relates to a spatial coding method in which a measurement target onto which pattern light is projected is imaged and a three-dimensional shape is measured on the basis of the captured image. A plurality of light/dark codes, each consisting of a center element and peripheral elements arranged around the center element, are arrayed in the pattern light, and the type of each light/dark code is identified from the arrangement of its peripheral elements. This reduces the number of pixels required per code and thereby secures spatial resolution.
 FIG. 1 is a block diagram showing the configuration of a three-dimensional shape measurement apparatus according to an embodiment of the present invention. Note that the three-dimensional shape measurement apparatus shown in FIG. 1 is simplified for explaining the present invention. The three-dimensional shape measurement apparatus according to the present invention is not limited to the configuration shown in FIG. 1, and may additionally include components necessary for measuring a three-dimensional shape (for example, a display device for checking the imaged measurement target). The three-dimensional shape measurement apparatus according to the present embodiment is applied, for example, to a gesture motion input device that is mounted in an automobile and accepts operations of a desired device by detecting the movements of a driver's hand or fingers.
 As shown in FIG. 1, the three-dimensional shape measurement apparatus 10 includes a projection device 11 that projects pattern light onto a measurement target 20, an imaging device 12 that images the measurement target 20, and a processing device 13 that performs image processing and the like on the image captured by the imaging device 12. In the three-dimensional shape measurement apparatus 10 shown in FIG. 1, the projection device 11 and the imaging device 12 constitute the projection unit and the imaging unit in the claims, respectively.
 The projection device 11 is constituted by, for example, a projector. Under the control of a projection device control unit 131 of the processing device 13, described later, the projection device 11 projects predetermined pattern light onto the measurement target 20. This predetermined pattern light will be described later. The imaging device 12 is constituted by, for example, a camera. The imaging device 12 images the measurement target 20 onto which the predetermined pattern light is projected, and outputs the captured image data of the measurement target 20 to an image capture unit 132 of the processing device 13, described later.
 In the three-dimensional shape measurement apparatus 10 according to the present embodiment, the projection device 11 and the imaging device 12 are fixed at predetermined positions. That is, the projection device 11 is arranged so that the pattern light can be projected onto a predetermined position, and the imaging device 12 is arranged so that it can image a certain range including the position onto which the pattern light is projected by the projection device 11. It is assumed that the internal and external parameters between the projection device 11 and the imaging device 12 have been determined in advance and that time synchronization between them is ensured.
 The processing device 13 includes the projection device control unit 131 that controls the projection device 11, the image capture unit 132 that captures image data from the imaging device 12, and a computation unit 133 that performs the computations necessary to measure the three-dimensional shape of the measurement target 20. The computation unit 133 includes a code extraction unit 133a, a code identification unit 133b, a corresponding point calculation unit 133c, and a three-dimensional position calculation unit 133d. The code extraction unit 133a and the code identification unit 133b constitute the extraction unit and the identification unit in the claims, respectively, and the corresponding point calculation unit 133c and the three-dimensional position calculation unit 133d constitute the calculation unit in the claims.
 The projection device control unit 131 controls the projection device 11, for example, in response to instructions from the control unit of the apparatus body in which the three-dimensional shape measurement apparatus 10 is mounted. When the three-dimensional shape measurement apparatus 10 is applied to a gesture motion input device for a vehicle driver, the projection device 11 is controlled in response to instructions from an electronic control unit (ECU) that controls the entire vehicle. Items controlled by the projection device control unit 131 include, for example, the type of pattern light projected onto the measurement target 20 and its projection timing.
 The image capture unit 132 captures the image data (captured image) of the measurement target 20 input from the imaging device 12 and outputs the captured image data to the computation unit 133. Specifically, it captures image data obtained by imaging the measurement target 20 onto which the predetermined pattern light is projected, and outputs that image data to the computation unit 133. The computation unit 133 performs predetermined computation processing on the image data input from the image capture unit 132 (the captured image of the measurement target 20 onto which the predetermined pattern light is projected), and thereby performs the computations necessary to measure the three-dimensional shape of the measurement target 20.
 Here, the pattern light used in the three-dimensional shape measurement apparatus 10 according to the present embodiment will be described. FIG. 2 is a diagram showing an example of this pattern light. For convenience of explanation, FIG. 2 shows a case where the pattern light consists of visible light. In the pattern light shown in FIG. 2, positions where the projection device 11 projects light are shown in black, and positions where no light is projected are shown in white. The same applies to the following drawings showing pattern light.
 The pattern light used in the three-dimensional shape measurement apparatus 10 according to the present embodiment preferably has a wavelength outside the visible range. By using an invisible wavelength band such as infrared light for the pattern light, the three-dimensional shape can be measured without the person serving as the measurement target 20, or people nearby, being aware of it. This makes it possible to measure the three-dimensional shape without interfering with the actions of the person serving as the measurement target 20 (for example, the driving of a vehicle).
 The pattern light shown in FIG. 2 includes light/dark codes in which the presence or absence of light projected at predetermined positions is coded. Specifically, a plurality of light/dark codes are arrayed in the pattern light shown in FIG. 2, according to a predetermined rule. As will be described in detail later, each light/dark code constituting the pattern light shares a part of itself with the adjacent light/dark codes, so that the codes are formed continuously with one another.
 FIG. 3 is an explanatory diagram of the light/dark codes 30 constituting the pattern light used in the three-dimensional shape measurement apparatus 10 according to the present embodiment. FIG. 3A shows one light/dark code 30 constituting the pattern light, and FIG. 3B shows the basic patterns of the light/dark code 30. For convenience of explanation, FIG. 3 also shows the center elements 31 of the adjacent light/dark codes 30.
 As shown within the broken line A in FIG. 3A, each light/dark code 30 consists of a center element 31 and a plurality of arms 32 (32a to 32d; four in the present embodiment) arranged around the center element 31. For example, the center element 31 has a circular shape and the arms 32 have a rectangular shape. However, the shapes of the center element 31 and the arms 32 are not limited to these and can be changed as appropriate. In the following description, the center element 31 is assumed to have a circular shape and the arms 32 a rectangular shape.
 The arms 32a to 32d are arranged continuously with the outer edge of the circular center element 31 at predetermined positions, at equal intervals (specifically, at 90° intervals). With respect to the horizontal plane passing through the center point of the center element 31, the arm 32a extends from that center point obliquely upward to the right in FIG. 3A, and the arm 32b extends from the center point obliquely downward to the right. Similarly, the arm 32c extends from the center point obliquely downward to the left in FIG. 3A, and the arm 32d extends from the center point obliquely upward to the left. As will be described in detail later, the horizontal plane passing through the center point of the center element 31 is arranged parallel to the epipolar plane EP described later (see FIG. 5).
 The tip of each of the arms 32a to 32d is connected to the center element 31 of an adjacent light/dark code 30. The arm 32a is connected to the center element 31 of the light/dark code 30 located obliquely above and to the right in FIG. 3A, and the arm 32b to that of the light/dark code 30 located obliquely below and to the right. Similarly, the arm 32c is connected to the center element 31 of the light/dark code 30 located obliquely below and to the left, and the arm 32d to that of the light/dark code 30 located obliquely above and to the left. In FIG. 3A, for convenience of explanation, the center elements 31 of the adjacent light/dark codes 30 are denoted as "center elements 31'".
 Thus, in the three-dimensional shape measurement apparatus 10 according to the present embodiment, each light/dark code 30 is a code (a 4-bit code) capable of expressing 16 types of information by the presence or absence of the four arms 32a to 32d connected to the center element 31 (see FIG. 3B). For example, when the arms 32a to 32d are associated with the first to fourth bits of information, respectively, as shown in FIG. 3B, the light/dark code 30a in which all four arms 32a to 32d are present is specified by the bit information "1111", the light/dark code 30b in which none of the four arms 32a to 32d is present is specified by the bit information "0000", and the light/dark code 30c in which only the arms 32a and 32c are present is specified by the bit information "0101".
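 As an illustrative sketch outside the patent text, the arm-presence coding can be modeled as a mapping between sets of arms and 4-bit strings. The bit ordering (arm 32a as the first, rightmost bit) is inferred from the examples "1111", "0000", and "0101" above; the Python names are assumptions.

```python
# Sketch of the 4-bit arm code: each light/dark code is identified by the
# presence or absence of arms 32a-32d around its center element.
# Assumption (inferred from the examples in the text): arm 32a is the first
# bit and is written rightmost, so arms {a, c} alone give "0101".

ARM_ORDER = ("d", "c", "b", "a")  # bit string written from the 4th bit down to the 1st

def arms_to_bits(present_arms):
    """Encode the set of present arms ('a'..'d') as a 4-character bit string."""
    return "".join("1" if arm in present_arms else "0" for arm in ARM_ORDER)

def bits_to_arms(bits):
    """Decode a 4-character bit string back into the set of present arms."""
    return {arm for arm, bit in zip(ARM_ORDER, bits) if bit == "1"}
```

With this ordering, the 16 subsets of the four arms map to 16 distinct bit strings, matching the 16 code types stated above.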
 In the pattern light used in the three-dimensional shape measurement apparatus 10 according to the present embodiment, such light/dark codes 30 are arrayed continuously in the X-axis and Y-axis directions shown in FIG. 2. For example, the light/dark code 301 shown in FIG. 2 is continuous with the light/dark codes 302 to 305 arranged around it. Here, the light/dark code 302 is arranged above and to the right of the light/dark code 301, and the light/dark code 303 below and to the right. The light/dark code 304 is arranged below and to the left of the light/dark code 301, and the light/dark code 305 above and to the left.
 In this case, the arm 32a of the light/dark code 301 also serves as the arm 32c of the light/dark code 302, and the arm 32b of the light/dark code 301 as the arm 32d of the light/dark code 303. Likewise, the arm 32c of the light/dark code 301 also serves as the arm 32a of the light/dark code 304, and the arm 32d of the light/dark code 301 as the arm 32b of the light/dark code 305. That is, the arms 32a to 32d of the light/dark code 301 are shared with some of the arms 32 of the surrounding light/dark codes 302 to 305. Sharing the arms 32 of each light/dark code 30 with those of the adjacent light/dark codes 30 in this way makes it possible to increase the density of the light/dark codes 30 arrayed in the pattern light.
 Further, in the pattern light used in the three-dimensional shape measurement apparatus 10 according to the present embodiment, a code group 300 formed by arraying a predetermined number (15 in the present embodiment) of light/dark codes 30 in the X-axis direction shown in FIG. 2 serves as a structural unit, and this code group 300 is repeatedly arrayed in the X-axis direction. In the code group 300 shown in FIG. 2, 15 different light/dark codes 30 are arrayed in a predetermined order.
 More specifically, in the code group 300 shown in FIG. 2, the light/dark code 30 specified by the bit information "1001" is arranged at the rightmost position. Toward the left, the light/dark codes 30 specified by the bit information "0100", "1100", "1101", "0111", "1101", "1000", "0110", "0010", "0001", "1111", "1110", "0101", and "0011" are arranged in that order, and the light/dark code 30 specified by the bit information "1010" is arranged at the leftmost position.
 Because 15 different light/dark codes 30 are arrayed in this way, a specific light/dark code 30 can be identified within the code group 300. The code group 300 defines the width over which a code that has moved in the X-axis direction after projection can still be distinguished from the same code in the neighboring code group 300 (the discriminable width over which the parallax shift between the projection pattern and the captured pattern can be discriminated). That is, the parallax shift after projection of the pattern light can be determined within the range of the width of the code group 300 (the discriminable width).
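 As a sketch (not the patent's implementation), locating a detected code within the repeating 15-code group reduces to a lookup in the published sequence; indexing from the rightmost code is an assumption for illustration. Note that the sequence as listed above contains "1101" twice, so a simple lookup returns only its first occurrence.

```python
# The 15-code group listed in the description, from the rightmost code to
# the leftmost. Indexing from the right is an assumption for illustration.
# Note: the published sequence lists "1101" twice; list.index() below
# therefore returns the first (rightmost) occurrence of that code.
CODE_GROUP = ["1001", "0100", "1100", "1101", "0111", "1101", "1000", "0110",
              "0010", "0001", "1111", "1110", "0101", "0011", "1010"]

def position_from_right(bits):
    """0-based index of a detected code within the group, counted from the right."""
    return CODE_GROUP.index(bits)

def position_from_left(bits):
    """1-based position counted from the left, as used in the description."""
    return len(CODE_GROUP) - position_from_right(bits)
```

For example, `position_from_left("1111")` evaluates to 5, which matches the worked example in the description where the "1111" code is the fifth code from the left in the code group 300.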
 Furthermore, in the pattern light used in the three-dimensional shape measurement apparatus 10 according to the present embodiment, the pattern in which the code groups 300 are repeatedly arrayed in the X-axis direction is itself repeatedly arrayed in the Y-axis direction. That is, the pattern light shown in FIG. 2 consists of the code group 300 repeatedly arrayed in both the X-axis and Y-axis directions. In other words, the pattern light shown in FIG. 2 is coded only over a certain range in the X-axis direction (the range corresponding to 15 light/dark codes 30) and is not coded in the Y-axis direction. In the three-dimensional shape measurement apparatus 10, the pattern light shown in FIG. 2 is projected from the projection device 11 onto the measurement target 20, and the measurement target 20 onto which the pattern light is projected is imaged by the imaging device 12.
 FIG. 4 is a schematic diagram for explaining the relationship between the light/dark codes 30 and the number of pixels in the three-dimensional shape measurement apparatus 10 according to the present embodiment. As shown within the frame A in FIG. 4, a light/dark code 30 according to the present embodiment can be represented, for example, by a total of 36 pixels: 6 pixels in the vertical direction by 6 pixels in the horizontal direction. Since the type of a light/dark code 30 according to the present embodiment is detected from the brightness of these pixels, the type can be detected properly even when the small number of pixels prevents its outline from being clearly identified. As a result, the number of codes that can be detected per unit area of the image sensor can be increased, and the spatial resolution can be improved.
 In contrast, when detecting a pattern in which graphic codes are arrayed in a two-dimensional sequence, the codes are identified by fitting a graphic code (for example, a graphic code with an elliptical shape as in Patent Document 1 described in the Background Art), so features such as the curves contained in the graphic code must be represented adequately. For this reason, there is a definite limit to how far the number of pixels representing a graphic code can be reduced; the number of graphic codes that can be detected per unit area of the image sensor therefore cannot be increased, and it is difficult to improve the spatial resolution.
 Returning to FIG. 1, the configuration of the computation unit 133 of the processing device 13 of the three-dimensional shape measurement apparatus 10 according to the present embodiment will be described. As described above, the computation unit 133 includes the code extraction unit 133a, the code identification unit 133b, the corresponding point calculation unit 133c, and the three-dimensional position calculation unit 133d, and performs the computations necessary to measure the three-dimensional shape of the measurement target 20.
 The code extraction unit 133a extracts the center elements 31 of the light/dark codes 30 from the image data input from the image capture unit 132 (the captured image of the measurement target 20 onto which the predetermined pattern light is projected). For example, the code extraction unit 133a extracts the center point information (center coordinate information) of the center element 31 of a light/dark code 30 on the basis of the brightness information in the captured image; more specifically, it extracts the center point information from the brightness maxima in the captured image. From the extracted center point information of the center element 31, the code extraction unit 133a obtains the position information of the corresponding light/dark code 30 and outputs it to the code identification unit 133b.
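 As a minimal sketch of this extraction step (an assumed procedure; the description only states that centers are taken from brightness maxima), candidate center coordinates can be found as thresholded local brightness maxima in a grayscale image:

```python
def extract_centers(image, threshold):
    """Return (row, col) of pixels at least as bright as `threshold` and
    strictly brighter than all 8 neighbors; `image` is a list of rows of
    intensity values. Border pixels are skipped for simplicity."""
    height, width = len(image), len(image[0])
    centers = []
    for r in range(1, height - 1):
        for c in range(1, width - 1):
            value = image[r][c]
            if value < threshold:
                continue
            neighbors = [image[r + dr][c + dc]
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                         if (dr, dc) != (0, 0)]
            if all(value > n for n in neighbors):
                centers.append((r, c))
    return centers
```

A real implementation would also suppress noise and interpolate to sub-pixel accuracy, but the local-maximum test conveys how center elements can be isolated without fitting any outline.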
 The code identification unit 133b identifies the type of each light/dark code 30 from the image data input from the image capture unit 132, on the basis of the position information of the light/dark code 30 input from the code extraction unit 133a. Specifically, the type of the light/dark code 30 is identified by the presence or absence of the arms 32a to 32d arranged around the center element 31. The code identification unit 133b also determines the position of the light/dark code 30 to be identified within the code group 300, for example by comparison with a prestored arrangement of the light/dark codes 30 in the code group 300. For example, when four arms 32a to 32d are detected around the center element 31, the code identification unit 133b identifies the light/dark code 30 to be identified as the light/dark code 30a specified by the bit information "1111" shown in FIG. 3A. In this case, the code identification unit 133b determines the position of the light/dark code 30 to be identified as the fifth position from the left in the code group 300 shown in FIG. 2. The code identification unit 133b outputs the identified type of the light/dark code 30 and its determined position within the code group 300 to the corresponding point calculation unit 133c.
 On the basis of the type of the light/dark code 30 and its position within the code group 300 input from the code identification unit 133b, the corresponding point calculation unit 133c calculates the corresponding point, in the projection image projected by the projection device 11, of the light/dark code 30 to be processed. Specifically, the corresponding point of the light/dark code 30 in the projection image is calculated by comparing the position of the relevant light/dark code 30 within the code group 300 in the projection image with the position of the light/dark code 30 reported by the code identification unit 133b. The corresponding point calculation unit 133c outputs the information on the calculated corresponding point (hereinafter, "corresponding point information") to the three-dimensional position calculation unit 133d.
 On the basis of the corresponding point information input from the corresponding point calculation unit 133c, the three-dimensional position calculation unit 133d calculates the three-dimensional position corresponding to the light/dark code 30 to be processed. Specifically, the three-dimensional position calculation unit 133d obtains the depth coordinate from the positional shift (shift due to parallax) of the center element 31 of the corresponding light/dark code 30 before and after projection, and from that depth coordinate calculates the three-dimensional position corresponding to the light/dark code 30. The three-dimensional position calculation unit 133d outputs the calculated three-dimensional position to the control unit of the apparatus body in which the three-dimensional shape measurement apparatus 10 is mounted. By calculating the three-dimensional positions of all the light/dark codes 30 projected onto the measurement target 20, the three-dimensional shape of the measurement target 20 can be measured.
 Here, the depth information for an arbitrary point P on the measurement target 20 in the three-dimensional shape measurement apparatus 10 according to the present embodiment will be described. FIG. 5 is an explanatory diagram of this depth information. FIG. 5A shows the positional relationship between the measurement target 20 and the projection device 11 and imaging device 12 of the three-dimensional shape measurement apparatus 10. FIG. 5B schematically shows the light receiving surface of a CCD (Charge Coupled Device) image sensor 121 of the imaging device 12. Note that f1 and f2 shown in FIG. 5A denote the focal lengths of the projection device 11 and the imaging device 12 (CCD image sensor 121), respectively.
 As shown in FIG. 5A, the imaging device 12 is arranged on the opposite side of the position where the projection device 11 projects the pattern light onto the measurement target 20. In this case, the X axis of the pattern light projected by the projection device 11 (see FIG. 2) is arranged parallel to the epipolar plane EP formed by three points: the principal point C1 of the projection device 11, the principal point C2 of the imaging device 12, and an arbitrary point P on the measurement target 20.
 Meanwhile, the imaging device 12 includes the CCD image sensor 121, whose imaging elements are arrayed parallel to the X-axis and Y-axis directions of the pattern light shown in FIG. 2. The X axis of the CCD image sensor 121 shown in FIG. 5B is arranged parallel to the X axis of the pattern light shown in FIG. 2, and the Y axis of the image sensor shown in FIG. 5B is parallel to the Y axis of that pattern light. That is, the imaging elements on the X axis of the imaging device 12 are arrayed parallel to the epipolar plane EP formed by the three points: the principal point C1 of the projection device 11, the principal point C2 of the imaging device 12, and an arbitrary point P on the measurement target 20.
 Consider obtaining depth information for an arbitrary point P on the measurement target 20 when the three-dimensional shape measurement apparatus 10 configured in this way projects the pattern light shown in FIG. 2. When the pattern light is projected onto the point P, the reflection from a virtual reference plane (hereinafter, "virtual reference plane") Zmin on the epipolar plane EP would be observed at the position Xmin on the CCD image sensor 121 of the imaging device 12. The reflection from the actual position Za detected in the captured image, on the other hand, is observed at the position Xa. The shift between Xmin and Xa is acquired as the depth information at the point P (the depth distance from the virtual reference plane Zmin). As shown in FIG. 5B, the line (epipolar line) EL connecting Xmin and Xa, which is used to obtain the depth information for the point P, does not vary in the Y-axis direction of the imaging device 12; it varies only in the X-axis direction and forms a straight line parallel to the X axis.
 In other words, the pattern position actually detected in the X-axis direction, measured against the expected position of the pattern light reflected from the virtual reference plane (reflecting surface) Zmin, represents a parallax shift; by obtaining the depth coordinate from this parallax shift, the three-dimensional position corresponding to the light/dark code 30 being processed can be calculated.
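The conversion from this parallax shift to a depth coordinate can be sketched as follows. This is a minimal illustration, assuming an ideal pinhole model with a baseline between the principal points C1 and C2 and a sign convention in which a positive shift corresponds to a point farther away than Zmin; the function and parameter names (pixel_pitch, z_ref, and so on) are illustrative and do not appear in the specification.

```python
def depth_from_shift(shift_px, pixel_pitch, focal_len, baseline, z_ref):
    """Depth Z of point P from the sensor shift Xa - Xmin (in pixels).

    Under reference-plane triangulation the shift d (in metres on the
    sensor) satisfies d = focal_len * baseline * (1/z_ref - 1/Z), so
    Z = 1 / (1/z_ref - d / (focal_len * baseline)).
    """
    d = shift_px * pixel_pitch  # shift converted to metres on the sensor
    return 1.0 / (1.0 / z_ref - d / (focal_len * baseline))
```

A zero shift returns the reference depth z_ref itself, and larger shifts map to larger depths, which is the behaviour the text describes for the parallax shift between Xmin and Xa.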
 The operation of the three-dimensional shape measurement apparatus 10 according to the present embodiment will now be described. FIG. 6 is a flowchart explaining the operation of the apparatus; in particular, it shows the operation of the processing device 13. In the flow shown in FIG. 6, it is assumed that three-dimensional shape measurement is instructed by the control unit of the apparatus body in which the three-dimensional shape measurement apparatus 10 is mounted.
 When three-dimensional shape measurement is instructed by the control unit of the apparatus body, the projection device 11 projects the pattern light shown in FIG. 2 onto the measurement target 20, and the imaging device 12 photographs the measurement target 20 with the pattern light projected on it. The image data (captured image) taken by the imaging device 12 is output to the image capture unit 132, whereby the processing device 13 acquires the captured image (step (hereinafter, "ST") 601).
 The captured image taken in by the image capture unit 132 is output to the calculation unit 133. On receiving the captured image, the code extraction unit 133a of the calculation unit 133 extracts the center-point information (center coordinate information) of the central element 31 of each light/dark code 30 contained in the captured image (ST602). From the extracted center-point information of the central element 31, the code extraction unit 133a acquires the position information of the corresponding light/dark code 30 and outputs it to the code identification unit 133b.
 In this way, in the three-dimensional shape measurement apparatus 10 according to the present embodiment, the code extraction unit 133a extracts the central element 31 of the light/dark code 30 and acquires the position information of the code from the center-point information of that element. Because the position of a code is obtained from the position of its central element 31 alone, the shape of the entire light/dark code 30 need not be recognized and the corresponding processing can be omitted, which further suppresses the increase in the computational cost required for code detection and identification.
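The center-point extraction of ST602 can be sketched as follows, assuming a thresholded grey image: bright pixels are grouped by 4-connected flood fill and each blob's centroid is taken as a candidate central element 31. The function name and threshold are illustrative assumptions; the specification does not prescribe a particular extraction algorithm.

```python
from collections import deque

def extract_centers(img, thresh=128):
    """Centroid of each bright blob in a 2-D list of grey values."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    centers = []
    for y in range(h):
        for x in range(w):
            if img[y][x] > thresh and not seen[y][x]:
                # Flood-fill one blob and collect its pixels.
                queue, pixels = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx] and img[ny][nx] > thresh):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # The blob centroid serves as the code's center point.
                cx_m = sum(p[1] for p in pixels) / len(pixels)
                cy_m = sum(p[0] for p in pixels) / len(pixels)
                centers.append((cx_m, cy_m))
    return centers
```

Because only these centroids are needed to locate each code, the full code shape never has to be matched, which is exactly the cost saving the paragraph above describes.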
 On receiving the position information of a light/dark code 30, the code identification unit 133b identifies the type of the code (ST603). The type is identified from the presence or absence of the arms 32a to 32d arranged around the central element 31. The code identification unit 133b then specifies the position of the identified light/dark code 30 within its code group 300. The type of the light/dark code 30 and its position in the code group 300 are output to the corresponding point calculation unit 133c.
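The identification of ST603 can be sketched as reading one bit per arm position. The diagonal sampling offsets, bit ordering, and threshold below are assumptions for illustration; the specification only requires that the presence or absence of the four arms 32a to 32d distinguishes the code type.

```python
# Assumed ordering of the four diagonal arm positions around the centre:
# upper-left, upper-right, lower-left, lower-right.
ARM_OFFSETS = ((-1, -1), (-1, 1), (1, -1), (1, 1))

def identify_code(img, cx, cy, thresh=128):
    """Return a 4-bit code value (0..15): bit i is 1 when arm i is present."""
    code = 0
    for i, (dy, dx) in enumerate(ARM_OFFSETS):
        if img[cy + dy][cx + dx] > thresh:  # bright pixel means arm present
            code |= 1 << i
    return code
```

With four arm positions this yields the 16 code types of the 4-bit code described in the embodiment.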
 On receiving the information such as the position of the light/dark code 30 in the code group 300, the corresponding point calculation unit 133c calculates the corresponding point of the target light/dark code 30 in the image projected by the projection device 11 (ST604). The corresponding point of the target light/dark code 30 in the projected image is obtained by comparing the position of the relevant light/dark code 30 in the code group 300 of the projected image with the position of the light/dark code 30 received from the code identification unit 133b. The corresponding point information calculated by the corresponding point calculation unit 133c is output to the three-dimensional position calculation unit 133d.
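The comparison in ST604 amounts to locating the identified code within the known projected code group 300. The sketch below uses a windowed search over a small example group; both the group contents and the use of a window of neighbouring codes are illustrative assumptions, since the specification only states that the positions in the code group are compared.

```python
PROJECTED_GROUP = [3, 7, 1, 12, 5, 9, 14, 2]  # hypothetical example group

def corresponding_index(observed, group=PROJECTED_GROUP):
    """Index in the projected code group where a run of observed codes matches.

    Matching a short window of neighbouring codes keeps the correspondence
    unambiguous within one group period; returns None when no run matches.
    """
    n = len(group)
    for start in range(n):
        if all(group[(start + k) % n] == c for k, c in enumerate(observed)):
            return start
    return None
```

The returned index fixes the code's position in the projected pattern, from which the expected sensor position (Xmin in FIG. 5B) follows.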
 On receiving the corresponding point information, the three-dimensional position calculation unit 133d calculates the three-dimensional position corresponding to the target light/dark code 30 (ST605). The three-dimensional position calculation unit 133d obtains the depth coordinate from the shift, at the central element 31 of the corresponding light/dark code 30, between the expected position of the reflection from the virtual reference plane and the actually detected position, and calculates from that depth coordinate the three-dimensional position corresponding to the target light/dark code 30. The calculated three-dimensional position is output to the control unit of the apparatus body in which the three-dimensional shape measurement apparatus 10 is mounted.
 When the processing of ST602 to ST605 for one light/dark code 30 in the captured image is complete, the calculation unit 133 determines whether processing has been completed for the entire captured image (that is, for all light/dark codes 30) (ST606). If processing is not complete for the entire captured image, the calculation unit 133 returns to ST602 and repeats ST602 to ST606; if it is complete, the calculation unit 133 ends the processing.
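The flow of ST601 to ST606 can be sketched as a single loop over the extracted codes. The four stage functions are passed in as parameters here because the specification defines the stages (133a to 133d) but not their implementations; the function names are illustrative.

```python
def measure_3d_shape(image, extract, identify, correspond, triangulate):
    """One-shot measurement loop: ST602-ST605 per code, ST606 when all done."""
    points_3d = []
    for center in extract(image):             # ST602: centre extraction (133a)
        code = identify(image, center)        # ST603: type identification (133b)
        match = correspond(code, center)      # ST604: corresponding point (133c)
        if match is not None:                 # skip codes with no correspondence
            points_3d.append(triangulate(center, match))  # ST605: 3-D position (133d)
    return points_3d                          # ST606: whole image processed
```

With stage implementations of the kind sketched in this section plugged in, the returned list is the point cloud handed back to the control unit of the apparatus body.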
 As described above, in the three-dimensional shape measurement apparatus 10 according to the present embodiment, the pattern light used to measure the three-dimensional shape of the measurement target 20 with a single projection and a single shot contains an array of light/dark codes 30, each consisting of a central element 31 and arms 32 arranged around the central element 31, and the type of each light/dark code 30 is identified from the arrangement of its arms 32. Compared with the use of graphical codes, this reduces the number of pixels each code requires for detection and identification. The number of codes detectable per unit area of the image sensor (the code density) can therefore be increased, improving the spatial resolution. As a result, the three-dimensional shape of the measurement target 20 can be measured with a single pattern-light projection and imaging operation while securing spatial resolution.
 Furthermore, in the three-dimensional shape measurement apparatus 10 according to the present embodiment, the pattern light projected onto the measurement target 20 consists of a repeated arrangement of code groups 300, each code group being a predetermined number of light/dark codes 30 arrayed in the direction parallel to the epipolar plane (the X-axis direction in FIG. 2). Because the pattern light is coded only in the direction parallel to the epipolar plane, fewer codes are needed than with a two-dimensional array pattern (a two-dimensional random point-cloud pattern) coded in the two directions orthogonal to each other relative to the epipolar plane (the directions parallel and perpendicular to it), so fewer bits are needed to represent all the codes. Accordingly, the width over which a code that has moved after projection in the direction parallel to the epipolar plane (the first direction) can still be distinguished from the same code in the neighboring code group (the discriminable width for resolving the parallax shift between the projected and captured patterns) can be made long with a small number of bits. A long measurement range in the depth direction can thus be secured while suppressing the increase in computational cost.
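The relationship between bit count and discriminable width described above can be made concrete with a small calculation. The idealised assumption here, not stated in the specification, is that a b-bit alphabet allows up to 2**b distinct codes per group, spaced one code pitch apart.

```python
def discriminable_width(bits_per_code, code_pitch_px):
    """Largest shift (in pixels) before a code collides with the same code
    in the neighbouring group: codes per group times pitch, idealised."""
    return (2 ** bits_per_code) * code_pitch_px
```

For example, 4-bit codes at an 8-pixel pitch give a 128-pixel discriminable range, while 8-bit codes at the same pitch give 2048 pixels; coding only along the epipolar direction lets all of these bits go toward lengthening this one-dimensional range.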
 In particular, in the three-dimensional shape measurement apparatus 10 according to the present embodiment, the pattern light projected onto the measurement target 20 has common arms 32 disposed between the central elements 31 of neighboring light/dark codes 30, so that the codes are arrayed in a continuously connected form. Because an arm 32 is shared between neighboring light/dark codes 30, the density of light/dark codes 30 arrayed in the pattern light can be increased. The number of codes detectable by the CCD image sensor 121 of the imaging device 12 can therefore be increased, further improving the spatial resolution.
 Furthermore, in the three-dimensional shape measurement apparatus 10 according to the present embodiment, the arms 32a to 32d of each light/dark code 30 are arranged so that their directions of extension as seen from the central element 31 are oblique to the arrangement direction of the light/dark codes 30 (the X-axis direction in FIG. 2). This secures a longer arm length than when the arms 32 extend parallel or perpendicular to the arrangement direction of the light/dark codes 30, improving the distinguishability of the position of the central element 31 and of the presence or absence of the arms 32a to 32d.
 The present invention is not limited to the embodiment described above and can be implemented with various modifications. The configurations illustrated in the accompanying drawings are not limiting and can be changed as appropriate within a range in which the effect of the present invention is exhibited, and other changes can be made as appropriate without departing from the scope of the object of the present invention.
 For example, the above embodiment describes the case where each light/dark code 30 is a 4-bit code. The configuration of each light/dark code 30 is not limited to this, however, and can be changed as appropriate. Depending on the amount of information to be expressed, the light/dark code 30 may be configured, for example, as a 3-bit code capable of expressing 8 types of information or as an 8-bit code capable of expressing 256 types. In that case, the number of light/dark codes 30 constituting a code group 300 can also be adjusted according to the amount of information the light/dark code 30 can express (that is, the discriminable width can be adjusted).
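The adjustment described here, choosing the bit width to match the required range, can be sketched as the inverse calculation; the function name and the pixel-based parameters are illustrative.

```python
import math

def bits_needed(shift_range_px, code_pitch_px):
    """Smallest code bit-width whose group spans the required shift range
    (3 bits give 8 code types, 8 bits give 256, matching the text)."""
    return math.ceil(math.log2(shift_range_px / code_pitch_px))
```

A designer who needs a 128-pixel range at an 8-pixel code pitch thus ends up at the 4-bit code of the embodiment, while a 2048-pixel range at the same pitch calls for the 8-bit variant.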
 The above embodiment also describes the case where each light/dark code 30 contained in the pattern light has a plurality of arms 32 as peripheral elements. The configuration of the peripheral elements of the light/dark code 30 is not limited to this and can be changed as appropriate; for example, a single arm 32 may be provided as the peripheral element. Likewise, the embodiment describes the case where each light/dark code 30 has a circular central element 31, but the configuration of the central element 31 is not limited to this and can be changed as appropriate; for example, a rectangular central element 31 may be provided.
 Various other forms of the light/dark code 30 are also conceivable. FIG. 7 is an explanatory diagram of light/dark codes 30 according to modifications of the present embodiment. FIG. 7A shows light/dark codes 30 that are not connected to the central elements 31 of neighboring codes. FIGS. 7B and 7C show forms in which the central element of the light/dark code 30 and the peripheral elements arranged around it have the same shape: in FIG. 7B both the central element and the peripheral elements are circular, and in FIG. 7C both are square. FIG. 7D shows a case where eight peripheral elements are arranged around the central element 31, that is, an 8-bit light/dark code 30. Even with these modified forms of the light/dark code 30, as in the above embodiment, the three-dimensional shape of the measurement target can be measured with a single pattern-light projection and imaging operation while securing spatial resolution.
 The above embodiment describes a three-dimensional shape measurement method that measures the instantaneous three-dimensional shape from a single capture. By measuring the three-dimensional shape at predetermined time intervals with such a method, however, dynamic measurement of the three-dimensional shape (motion input) can also be realized.
 The above embodiment also describes the case where the arrangement direction of the light/dark codes 30 is parallel to the epipolar plane. The arrangement direction of the light/dark codes 30 is not limited to this and can be changed as appropriate; for example, the light/dark codes 30 may be arrayed along a direction slightly inclined with respect to the epipolar plane. Even with such changes, the same effects as in the above embodiment are obtained.
DESCRIPTION OF SYMBOLS
10 Three-dimensional shape measurement apparatus
11 Projection device
12 Imaging device
13 Processing device
131 Projection device control unit
132 Image capture unit
133 Calculation unit
133a Code extraction unit
133b Code identification unit
133c Corresponding point calculation unit
133d Three-dimensional position calculation unit
20 Measurement target
30, 30a to 30c Light/dark code
31 Central element
32, 32a to 32d Arm
300 Code group
301 to 305 Light/dark code

Claims (11)

  1.  A three-dimensional shape measurement method comprising: a projection step of projecting pattern light in which a plurality of light/dark codes are arrayed onto a measurement target; an imaging step of photographing the measurement target onto which the pattern light is projected; a code extraction step of extracting the light/dark codes from a captured image of the measurement target; a code identification step of identifying the type of each extracted light/dark code; and a three-dimensional position calculation step of calculating a three-dimensional position from the position information of each identified light/dark code,
     wherein each light/dark code consists of a central element and one or more peripheral elements arranged around the central element, and the type of the light/dark code is identified by the arrangement of the peripheral elements.
  2.  The three-dimensional shape measurement method according to claim 1, wherein the pattern light is a pattern in which a code group of a predetermined number of the light/dark codes arrayed in a first direction is taken as a constituent unit and the constituent unit is repeatedly arrayed in a second direction perpendicular to the first direction.
  3.  The three-dimensional shape measurement method according to claim 2, wherein the pattern light is a pattern in which a code group of a predetermined number of the light/dark codes arrayed in the first direction parallel to an epipolar plane is taken as a constituent unit and the constituent unit is repeatedly arrayed in the second direction perpendicular to the first direction and to the epipolar plane.
  4.  The three-dimensional shape measurement method according to any one of claims 1 to 3, wherein the pattern light has common peripheral elements disposed between the central elements of neighboring light/dark codes, and the neighboring light/dark codes are arrayed so as to be continuously connected.
  5.  The three-dimensional shape measurement method according to any one of claims 1 to 4, wherein, in the code extraction step, the central element of each light/dark code is extracted and the position information of the light/dark code is acquired from the position of the central element.
  6.  The three-dimensional shape measurement method according to any one of claims 1 to 5, wherein each light/dark code has four positions around the central element at which the peripheral elements can be arranged.
  7.  The three-dimensional shape measurement method according to any one of claims 1 to 6, wherein the peripheral elements are arranged so that their directions as seen from the central element are oblique to the arrangement direction of the light/dark codes.
  8.  The three-dimensional shape measurement method according to any one of claims 1 to 7, wherein the pattern light projected in the projection step has a wavelength outside the visible range.
  9.  A three-dimensional shape measurement apparatus comprising: a projection unit that projects pattern light in which a plurality of light/dark codes are arrayed onto a measurement target; an imaging unit that photographs the measurement target onto which the pattern light is projected; an extraction unit that extracts the light/dark codes from a captured image of the measurement target; an identification unit that identifies the type of each light/dark code extracted by the extraction unit; and a calculation unit that calculates a three-dimensional position from the position information of each light/dark code identified by the identification unit,
     wherein each light/dark code consists of a central element and one or more peripheral elements arranged around the central element, and the type of the light/dark code is identified by the arrangement of the peripheral elements.
  10.  The three-dimensional shape measurement apparatus according to claim 9, wherein the pattern light projected by the projection unit is a pattern in which a code group of a predetermined number of the light/dark codes arrayed in a first direction is taken as a constituent unit and the constituent unit is repeatedly arrayed in a second direction perpendicular to the first direction.
  11.  The three-dimensional shape measurement apparatus according to claim 10, wherein the pattern light projected by the projection unit is a pattern in which a code group of a predetermined number of the light/dark codes arrayed in the first direction parallel to an epipolar plane is taken as a constituent unit and the constituent unit is repeatedly arrayed in the second direction perpendicular to the first direction and to the epipolar plane.
PCT/JP2014/073495 2013-09-05 2014-09-05 Three-dimensional shape measurement method and three-dimensional shape measurement device WO2015034048A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-183609 2013-09-05
JP2013183609 2013-09-05

Publications (1)

Publication Number Publication Date
WO2015034048A1 (en)

Family

ID=52628509

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/073495 WO2015034048A1 (en) 2013-09-05 2014-09-05 Three-dimensional shape measurement method and three-dimensional shape measurement device

Country Status (1)

Country Link
WO (1) WO2015034048A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3754296A4 (en) * 2018-02-14 2021-11-10 Omron Corporation Three-dimensional measurement device, three-dimensional measurement method, and program
US11321860B2 (en) 2018-02-14 2022-05-03 Omron Corporation Three-dimensional measurement apparatus, three-dimensional measurement method and non-transitory computer readable medium
CN113204982A (en) * 2021-04-20 2021-08-03 Oppo广东移动通信有限公司 Three-dimensional graphic code, three-dimensional graphic code identification method, device, medium and equipment
CN116152189A (en) * 2023-01-31 2023-05-23 华纺股份有限公司 Pattern fabric flaw detection method, system and detection terminal
CN116152189B (en) * 2023-01-31 2023-12-19 华纺股份有限公司 Pattern fabric flaw detection method, system and detection terminal

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001082940A (en) * 1999-09-14 2001-03-30 Sanyo Electric Co Ltd Apparatus and method for generating three-dimensional model
JP2002122416A (en) * 2000-10-16 2002-04-26 Sumitomo Osaka Cement Co Ltd Three-dimensional shape measuring device
JP2006092271A (en) * 2004-09-24 2006-04-06 Canon Inc Feature identification device, its method, program code and storage medium
JP2008052403A (en) * 2006-08-23 2008-03-06 Univ Of Tokyo Pattern in which two-dimensional position information is coded, position identification system and method using the pattern
JP2012504771A (en) * 2008-10-06 2012-02-23 マンチスビジョン リミテッド Method and system for providing three-dimensional and distance inter-surface estimation
US20120063672A1 (en) * 2006-11-21 2012-03-15 Mantis Vision Ltd. 3d geometric modeling and motion capture using both single and dual imaging
JP2012098265A (en) * 2010-11-02 2012-05-24 Beru Techno:Kk Measuring device of weight, shape, and other property

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
P. VUYLSTEKE ET AL.: "Range Image Acquisition with a Single Binary-Encoded Light Pattern", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 12, no. 2, February 1990 (1990-02-01), pages 148 - 164 *

Similar Documents

Publication Title
JP5576726B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method, and program
CN109751973B (en) Three-dimensional measuring device, three-dimensional measuring method, and storage medium
US10902668B2 (en) 3D geometric modeling and 3D video content creation
US10768435B2 (en) System, method and computer program product to project light pattern
CN103069250B (en) 3-D measuring apparatus, method for three-dimensional measurement
JP6302414B2 (en) Motion sensor device having a plurality of light sources
US20070126735A1 (en) Method and apparatus for 3-D data input to a personal computer with a multimedia oriented operating system
CN107480613A (en) Face identification method, device, mobile terminal and computer-readable recording medium
WO2011013373A1 (en) Measuring apparatus, measuring method, and program
CN107483845B (en) Photographing method and device
CN113748313B (en) Three-dimensional measurement system and three-dimensional measurement method
JP2016136321A (en) Object detection device and object detection method
JP6009206B2 (en) 3D measuring device
EP3371780A1 (en) System and methods for imaging three-dimensional objects
WO2015034048A1 (en) Three-dimensional shape measurement method and three-dimensional shape measurement device
JP6466679B2 (en) Object detection device
JP5968370B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method, and program
JP2011252835A (en) Three dimensional shape measuring device
JP4351090B2 (en) Image processing apparatus and image processing method
WO2015034049A1 (en) Three-dimensional shape measurement method and three-dimensional shape measurement device
JP2006058092A (en) Three-dimensional shape measuring device and method
JP2012098207A (en) Position measuring device, position measuring method and marker
CN104052982B (en) The scaling method and equipment of integration imaging display
JP2016121916A (en) Shape measurement device

Legal Events

Code	Title / Description
121	Ep: the epo has been informed by wipo that ep was designated in this application
	Ref document number: 14841530
	Country of ref document: EP
	Kind code of ref document: A1
NENP	Non-entry into the national phase
	Ref country code: DE
122	Ep: pct application non-entry in european phase
	Ref document number: 14841530
	Country of ref document: EP
	Kind code of ref document: A1
NENP	Non-entry into the national phase
	Ref country code: JP