
US10732284B2 - Live metrology of an object during manufacturing or other operations - Google Patents


Info

Publication number
US10732284B2
Authority
US
United States
Prior art keywords
sensors
scanning
point cloud
polarization
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/663,397
Other versions
US20190033461A1 (en)
Inventor
Liam Antonio Wingert
Chris A. Cantrell
Anthony W. Baker
Kenneth Paul Bowers, III
James A. Grossnickle
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Boeing Co
Original Assignee
Boeing Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Boeing Co
Priority to US15/663,397 (US10732284B2)
Assigned to THE BOEING COMPANY. Assignment of assignors' interest (see document for details). Assignors: BAKER, ANTHONY W.; BOWERS, KENNETH PAUL, III; CANTRELL, Chris A.; GROSSNICKLE, JAMES A.; WINGERT, Liam Antonio
Priority to ES18173753T (ES2757561T3)
Priority to EP18173753.7A (EP3435028B1)
Priority to KR1020180071265A (KR102643295B1)
Priority to JP2018136443A (JP7294778B2)
Publication of US20190033461A1
Application granted
Publication of US10732284B2
Legal status: Active
Adjusted expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/245 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20 Finite element generation, e.g. wire-frame surface description, tesselation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/0006 Industrial image inspection using a design-rule based approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination

Definitions

  • the present disclosure relates to inspecting an object and performing measurements or metrology of the object, and more particularly to live or near real-time metrology of an object during manufacturing or other operations.
  • a system for live metrology of an object includes a plurality of sensors for performing a scanning operation to collect electronic images of an object.
  • the electronic images include 3-D point cloud data for live metrology of the object.
  • the point cloud data from each sensor defines a point cloud that represents the object.
  • the system also includes a processor and a live metrology module operating on the processor.
  • the live metrology module is configured to perform a set of functions including stitching the point clouds together from the plurality of sensors to generate a reconstructed model of an as-manufactured object.
  • the set of functions also includes comparing the reconstructed model of the as-manufactured object to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.
  • a computer program product for live metrology of an object includes a computer readable storage medium having program instructions embodied therewith.
  • the computer readable storage medium is not a transitory medium per se.
  • the program instructions are executable by a device to cause the device to perform a method including performing a scanning operation by a plurality of sensors to collect electronic images of an object.
  • the electronic images include 3-D point cloud data for live metrology of the object.
  • the point cloud data from each sensor defines a point cloud that represents the object.
  • the method also includes stitching the point clouds from the plurality of sensors together to generate a reconstructed model of an as-manufactured object.
  • the method further includes comparing the reconstructed model of the as-manufactured object to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.
  • the method or set of functions further includes placing the plurality of sensors in a predetermined array or arrays of different types of sensors.
  • the method or set of functions further includes placing the plurality of sensors a predetermined distance from the object to avoid interference from humans and equipment during manufacturing.
  • the method or set of functions further includes determining a plurality of scanning areas using the depth map.
  • mapping the surface images to the conjoined polarization sensor map includes generating a representation of the object using the polarization point clouds and the surface image point clouds.
  • the surface representation of the object includes a resolution adaptive mesh corresponding to the object for 3-D metrology of the object.
  • stitching the point clouds together includes generating a resolution adaptive mesh using the point clouds.
  • the resolution adaptive mesh corresponds to the reconstructed model of the as-manufactured object for 3-D metrology of the object.
  • stitching the point clouds together includes fitting a mesh to the point clouds using an intermediate implicit representation of each point cloud.
  • the method or set of functions wherein an accuracy of the reconstructed model of the as-manufactured object is within about 0.003 inch of the actual manufactured object.
  • the method or set of functions further including providing the reconstructed model of the as-manufactured object as an input to a machine controller of a machine to avoid inadvertent contact by the machine.
  • the plurality of sensors includes one or more of an array of Light Detection and Ranging (LIDAR) devices or systems, an array of spatial phase imaging sensors, an array of time-of-flight sensors or cameras, an array of stereoscopic cameras, an array of light field cameras, and an array of high resolution cameras or other types of devices or sensors capable of performing the functions described herein.
  • FIG. 1 is a block schematic diagram of an example of a system for performing live metrology of an object in accordance with an embodiment of the present disclosure.
  • FIG. 3 is a flow chart of an example of a method for performing a scanning operation to generate a surface representation of an object in accordance with an embodiment of the present disclosure.
  • FIG. 4 is an illustration of an example of a system for performing live metrology of an object in accordance with another embodiment of the present disclosure.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • the TOF sensors 116 collect images 106 including depth “D” or range data of points 122 or 3-D locations on a surface 124 of the object 102 from the sensors 104 during a scanning operation.
  • the TOF sensors 116 determine the depth D or range by reflecting an infrared light beam on the object 102 and measuring a time from transmitting the light beam to return of the reflected light beam from the object 102 .
  • a point cloud 110 is generated by each TOF sensor 116 that includes the depth information of points 122 on the object 102 .
  • the point clouds 110 generated by the TOF sensors 116 define a depth map 126 for use in generating the reconstructed model 112 of the as-manufactured object 102 as described herein.
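The round-trip timing described above reduces to D = c·Δt/2. A minimal sketch of that conversion (numpy; the timing values are hypothetical, not measurements from the disclosed system):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_depth(round_trip_s: np.ndarray) -> np.ndarray:
    """Depth from the round-trip time of a reflected IR beam: D = c * t / 2."""
    return C * round_trip_s / 2.0

# A ~6.67 ns round trip corresponds to roughly 1 m of depth.
t = np.array([6.67e-9, 13.34e-9])
depths = tof_depth(t)
```

Collecting one such depth per pixel yields the depth map the disclosure builds from the TOF sensor array.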
  • the electronic images 106 include 3-D point cloud data 108 of the object 102 .
  • the point cloud data 108 from each sensor 104 defines a point cloud 110 that represents the object 102 .
  • Each point cloud 110 includes a multiplicity of points 132 and each point 132 includes at least location information for a corresponding point 122 on the surface 124 of the object 102 .
  • the point cloud data 108 or point clouds 110 from each sensor 104 are stored in a database 130 or other data storage device.
  • the object 102 is an aircraft or portion of an aircraft and the reconstructed model 112 of the as-manufactured object 102 , as described herein, is used for 3-D live metrology of the aircraft or portion of the aircraft during assembly or manufacturing or for some other purpose.
  • the object 102 is any product or item or portion of a product or item and the reconstructed model 112, as described herein, is used to perform 3-D live metrology on the product or item during assembly, manufacturing or other operations.
  • the system 100 also includes a processor 134 and a live metrology module 136 operating on the processor 134 .
  • the live metrology module is configured to perform the operations or functions described with reference to FIGS. 2 and 3 .
  • the live metrology module 136 is configured to perform a set of functions including but not limited to stitching the point clouds 110 from the plurality of sensors 104 together to generate the reconstructed model 112 of the as-manufactured object 102 and to compare the reconstructed model 112 to an as-designed model 138 of the object 102 to determine that the object 102 is being manufactured within allowable tolerances 140 to the as-designed model 138 of the object 102 .
  • the reconstructed model 112 is generated in near real-time with the performance of the scanning operation described with reference to FIGS. 2 and 3 .
  • the system 100 also includes an apparatus 144 for positioning the object 102 in different orientations or positions for scanning.
  • the object 102 may need to be positioned with different surfaces or sides facing the sensors 104 so that a reconstructed model 112 of the complete or entire as-manufactured object 102 can be generated or formed.
  • the apparatus 144 is also sized so that humans or equipment 142 used in manufacturing, assembly or other functions associated with the object 102 are movable with respect to the apparatus 144 .
  • the processor 134 controls operation of the 3-D scanning sensors 114 , time-of-flight sensors 116 , polarization sensors 118 or scanning sensors, and structured light projectors 120 . In accordance with another embodiment, the processor 134 or another processor also controls apparatus 144 for positioning the object 102 for scanning.
  • FIG. 2 is a flow chart of an example of a method 200 for live metrology of an object in accordance with an embodiment of the present disclosure.
  • the method 200 is embodied in and performed by the system 100 in FIG. 1 .
  • the sensors are calibrated.
  • a known or standard object whose dimensions are verified against an as-designed object may be placed in the system 100 to calibrate the sensors to confirm that they are operating properly and are collecting accurate measurement data.
  • Calibration also collects the intrinsic and extrinsic parameters used by the live metrology module 136 to run properly.
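The intrinsic and extrinsic parameters gathered during calibration are what allow measurements from different sensors to be expressed in one common frame. A hedged sketch of how such parameters might be applied (numpy; the matrix values below are hypothetical, not calibration output from the disclosed system):

```python
import numpy as np

def backproject(u, v, depth, K, R, t):
    """Back-project pixel (u, v) at a measured depth into the common frame
    using camera intrinsics K and extrinsics (R, t) from calibration."""
    uv1 = np.array([u, v, 1.0])
    p_cam = depth * np.linalg.inv(K) @ uv1  # point in the sensor frame
    return R @ p_cam + t                    # point in the common frame

K = np.array([[500.0, 0.0, 320.0],   # hypothetical pinhole intrinsics
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)        # hypothetical identity extrinsics
p = backproject(320, 240, 2.0, K, R, t)  # principal-point pixel, 2 m away
```

Applying this per pixel, per sensor, produces point clouds that are already registered to one another up to calibration error.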
  • An accuracy of the reconstructed model of the as-manufactured object is preferably within about 0.003 inch of the actual manufactured object being measured or for which the reconstructed model is being generated.
  • the object is positioned in a selected orientation for performing the scanning operation with a surface or side of the object to be scanned facing the sensors.
  • a scanning operation is performed by a plurality of sensors to collect electronic images of the object.
  • the electronic images include 3-D point cloud data for live 3-D metrology of the object.
  • the point cloud data from each sensor defines a point cloud that represents the object similar to that previously described.
  • An example of a scanning operation useable in block 206 will be described with reference to FIG. 3 .
  • the 3-D point cloud data is stored after performing the scanning operation.
  • the point cloud data or 3-D point cloud data is stored after performing each scanning operation.
  • This application discloses a method and system for representing and optimally combining point cloud data or 3-D point cloud data generated by a plurality of sensors or a 3-D scanning system, similar to that described herein, and for measuring and tracking objects in large spatial volumes, such as, for example, a factory during a manufacturing or assembly process of an object, such as an aircraft.
  • the system and method convert point clouds into a surface representation useful for metrology while taking into account the spatially varying resolutions in different directions of the plurality of sensors observing the same object.
  • the system and method do this by weighting the contribution of each point to the surface representation differently depending on the expected accuracy of the point in the point cloud. This is in contrast to existing surface reconstruction methods that weight all points equally and do not incorporate sensor resolution models or prior knowledge about the expected resolution of the sensors as a function of viewing direction and distance from the object.
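The accuracy-weighted combination described above can be illustrated with a toy inverse-variance fusion (numpy). The error model sigma ∝ distance / cos(incidence) is a hypothetical stand-in for a real sensor resolution model, not the model used in the disclosure:

```python
import numpy as np

def point_weight(distance, incidence_cos, sigma0=0.001):
    """Inverse-variance weight: expected error grows with range and with
    grazing viewing angles (hypothetical sensor-resolution model)."""
    sigma = sigma0 * distance / max(incidence_cos, 1e-3)
    return 1.0 / sigma ** 2

def fuse(points, weights):
    """Weighted average of estimates of the same surface point made by
    several sensors; more accurate observations dominate."""
    w = np.asarray(weights, dtype=float)[:, None]
    return (np.asarray(points, dtype=float) * w).sum(axis=0) / w.sum()

# Two sensors see the same point; the closer, more head-on view dominates.
p = fuse([[0.0, 0.0, 1.00], [0.0, 0.0, 1.10]],
         [point_weight(1.0, 1.0), point_weight(5.0, 0.3)])
```

With equal weights the result would sit at z = 1.05; the resolution-aware weighting pulls it toward the more trustworthy observation.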
  • the system and method optimizes the 3-D surface representation of the object or reconstructed model of the as-manufactured object derived from stitching or fusion of the point clouds from the plurality of sensors observing the same object but from different directions and distances.
  • stitching the point clouds together includes fitting a mesh to the point clouds using an intermediate implicit representation of each point cloud.
  • An example of stitching or fusing point clouds together using an intermediate implicit representation of each point cloud is described in U.S. application Ser. No. 15/663,243, entitled “Resolution Adaptive Mesh That Is Generated Using an Intermediate Implicit Representation of a Point Cloud,” which is assigned to the same assignee as the present application and is incorporated herein by reference.
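The incorporated application's exact method is not reproduced here; as a loose illustration of an intermediate implicit representation, the following fits a crude implicit (signed-distance) function to a point cloud, from which a mesh could later be extracted, e.g. by marching cubes. The sphere model and sample data are hypothetical:

```python
import numpy as np

def fit_implicit_sphere(points):
    """Intermediate implicit representation: signed distance to a sphere
    fitted to the cloud (centroid + mean radius). The zero level set is
    the reconstructed surface a mesher would extract."""
    c = points.mean(axis=0)
    r = np.linalg.norm(points - c, axis=1).mean()
    return lambda x: np.linalg.norm(x - c, axis=-1) - r

# Samples of a unit sphere stand in for a scanned point cloud.
rng = np.random.default_rng(0)
d = rng.normal(size=(500, 3))
pts = d / np.linalg.norm(d, axis=1, keepdims=True)
f = fit_implicit_sphere(pts)
# f is ~0 on the scanned surface and positive outside it.
```

A real resolution adaptive mesh would use a far richer implicit function, but the two-stage structure (points → implicit field → mesh) is the same.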
  • the reconstructed model of the as-manufactured object is compared to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.
  • any deviation between the as-manufactured object and the as-designed object is determined using the reconstructed model of the as-manufactured object.
  • the reconstructed model is used for 3-D metrology of the object similar to that described in U.S. application Ser. No. 15/663,190, entitled “Resolution Adaptive Mesh for Performing 3-D Metrology of an Object” and U.S. application Ser. No. 15/663,243, entitled “Resolution Adaptive Mesh That Is Generated Using an Intermediate Implicit Representation of a Point Cloud.”
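The tolerance comparison can be sketched as a nearest-neighbor deviation check against the as-designed geometry (brute-force numpy; the point sets and uniform offset are hypothetical, while the 0.003-inch figure comes from the disclosure):

```python
import numpy as np

TOLERANCE_IN = 0.003  # allowable deviation, inches (from the disclosure)

def max_deviation(as_manufactured, as_designed):
    """For each reconstructed point, distance to the nearest as-designed
    point; the maximum over all points is the worst-case deviation."""
    d = np.linalg.norm(as_manufactured[:, None, :] - as_designed[None, :, :],
                       axis=2)
    return d.min(axis=1).max()

designed = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
built = designed + 0.001  # hypothetical 0.001-inch uniform offset
ok = max_deviation(built, designed) <= TOLERANCE_IN
```

Production-scale clouds would need a spatial index (e.g. a k-d tree) and point-to-surface rather than point-to-point distances, but the accept/reject logic is the same.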
  • An operation may be performed on the object, such as a manufacturing operation using the reconstructed model.
  • the reconstructed model of the as-manufactured object is used as an input to a machine controller of a machine to avoid inadvertent contact by the machine.
  • FIG. 3 is a flow chart of an example of a method 300 for performing a scanning operation to generate a surface representation of an object in accordance with an embodiment of the present disclosure.
  • the method 300 is used for performing the scanning operation in block 206 in FIG. 2 .
  • the method 300 is embodied in and performed by the system 100 in FIG. 1 .
  • a depth map is created using a group of time-of-flight sensors of the plurality of sensors.
  • the depth map includes a point cloud of point cloud data including distances or ranges from the time-of-flight sensors to the object.
  • a plurality of scanning regions are illuminated using the depth map.
  • the scanning regions are illuminated by the structured light projectors 120 .
  • the object is scanned using a multiplicity of polarization sensors or other scanning sensors of the plurality of sensors using the depth map to collect a multiplicity of polarization images.
  • a plurality of scanning areas on the object are determined using the depth map.
  • Each polarization image defines a polarization point cloud generated by each polarization sensor.
  • the polarization or scanning sensors are mounted on elevators for movement relative to the object for scanning. Elevator zones associated with the elevators are assigned by square footage to keep the areas defined by the elevator zones clear of objects during a scanning operation. Sensors are also mountable on pan/tilt units for performing scanning operations, and the areas associated with these units are likewise kept clear of objects during scanning.
  • the polarization images are stitched or fused together to form a complete, conjoined or fused polarization sensor map.
  • the polarization images are stitched or fused together using techniques described in U.S. application Ser. No. 15/663,190 or U.S. application Ser. No. 15/663,243.
  • Stitching the polarization images together includes generating a representation of the object from the polarization point clouds.
  • the representation of the object includes a resolution adaptive mesh.
  • a plurality of surface images of the object are captured or collected using a plurality of 3-D scanning sensors of the plurality of sensors.
  • Each surface image defines a surface image point cloud generated by each of the 3-D scanning sensors of a surface or side of the object facing the 3-D scanning sensors.
  • FIG. 4 is an illustration of an example of a system 400 for performing live metrology of an object 404 in accordance with another embodiment of the present disclosure.
  • the system 400 is used for the system 100 in FIG. 1 or is part of the system 100 in FIG. 1 .
  • the system 400 includes an apparatus 402 for positioning an object 404 for scanning.
  • the apparatus 402 may be used for the apparatus 144 in FIG. 1 .
  • the apparatus 402 for positioning the object 404 includes a base 406 and a pair of opposite side walls 408 and 410 extending from the base 406 .
  • a gantry 412 extends between the tops of the side walls 408 and 410 and moves along a track 414 on each side wall 408 and 410 .
  • the gantry 412 includes a crane 416 for positioning or repositioning the object 404 for scanning different surfaces or sides of the object 404 as described herein.
  • a first horizontal structural member 426 a includes a plurality of arrays of pan/tilt units 430 for scanning the object 404 and collecting electronic images including data for generating the reconstructed model of the as-manufactured object 404 .
  • the pan/tilt units 430 include but are not limited to 3-D scanning sensors or devices.
  • a second horizontal structural member 426 b includes a plurality of arrays of 3-D scanning sensors 432 or cameras.
  • a third horizontal structural member 426 c includes a plurality of arrays of time-of-flight sensors 434 and a fourth horizontal structural member 426 d includes a plurality of lighting projectors 422 or infrared lighting projectors.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Abstract

A method for live metrology of an object includes performing a scanning operation by a plurality of sensors to collect electronic images of an object. The electronic images include 3-D point cloud data for live metrology of the object and the point cloud data from each sensor defines a point cloud that represents the object. The method also includes stitching the point clouds from the plurality of sensors together to generate a reconstructed model of an as-manufactured object. The method further includes comparing the reconstructed model of the as-manufactured object to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is related to U.S. application Ser. No. 15/663,190, now U.S. Pat. No. 10,438,408, entitled “Resolution Adaptive Mesh for Performing 3-D Metrology of an Object,” filed on the same date as the present application and incorporated herein by reference.
This application is also related to U.S. application Ser. No. 15/663,243, now U.S. Pat. No. 10,354,444, entitled “Resolution Adaptive Mesh That Is Generated Using an Intermediate Implicit Representation of a Point Cloud,” filed on the same date as the present application and incorporated herein by reference.
FIELD
The present disclosure relates to inspecting an object and performing measurements or metrology of the object, and more particularly to live or near real-time metrology of an object during manufacturing or other operations.
BACKGROUND
Inspecting larger components or objects during manufacturing or other processes is a time-consuming and labor intensive operation which adds cost and time to the manufacturing process. Inaccuracies in measuring such components during manufacturing can also cause manufacturing inefficiencies and quality defects. Accordingly, there is a need for a system and method that overcomes these deficiencies, detects defects early in the production cycle, decreases inspection time and is useable for automation of manufacturing, inspection or other processes.
SUMMARY
In accordance with an embodiment, a method for live metrology of an object includes performing a scanning operation by a plurality of sensors to collect electronic images of an object. The electronic images include three-dimensional (3-D) point cloud data for live metrology of the object and the point cloud data from each sensor defines a point cloud that represents the object. The method also includes stitching the point clouds from the plurality of sensors together to generate a reconstructed model of an as-manufactured object. The method further includes comparing the reconstructed model of the as-manufactured object to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.
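The three steps of the method (scan, stitch, compare) can be outlined as a skeleton (Python/numpy; the stub sensors and plain concatenation are hypothetical stand-ins for the actual scanning hardware and the mesh-fusion step):

```python
import numpy as np

def scan(sensors):
    """Each sensor yields a point cloud (N x 3). This stub just wraps
    the supplied data as arrays."""
    return [np.asarray(cloud, dtype=float) for cloud in sensors]

def stitch(clouds):
    """Stitch per-sensor clouds into one as-manufactured model
    (plain concatenation stands in for resolution adaptive meshing)."""
    return np.vstack(clouds)

def within_tolerance(model, designed, tol):
    """True if every reconstructed point lies within tol of the
    as-designed point set (nearest-point check)."""
    d = np.linalg.norm(model[:, None] - designed[None], axis=2).min(axis=1)
    return bool((d <= tol).all())

clouds = scan([[[0.0, 0.0, 0.0]], [[1.0, 0.0, 0.0]]])
model = stitch(clouds)
result = within_tolerance(model, np.array([[0.0, 0.0, 0.0],
                                           [1.0, 0.0, 0.0]]), 0.003)
```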
In accordance with another embodiment, a system for live metrology of an object includes a plurality of sensors for performing a scanning operation to collect electronic images of an object. The electronic images include 3-D point cloud data for live metrology of the object. The point cloud data from each sensor defines a point cloud that represents the object. The system also includes a processor and a live metrology module operating on the processor. The live metrology module is configured to perform a set of functions including stitching the point clouds together from the plurality of sensors to generate a reconstructed model of an as-manufactured object. The set of functions also includes comparing the reconstructed model of the as-manufactured object to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.
In accordance with a further embodiment, a computer program product for live metrology of an object includes a computer readable storage medium having program instructions embodied therewith. The computer readable storage medium is not a transitory medium per se. The program instructions are executable by a device to cause the device to perform a method including performing a scanning operation by a plurality of sensors to collect electronic images of an object. The electronic images include 3-D point cloud data for live metrology of the object. The point cloud data from each sensor defines a point cloud that represents the object. The method also includes stitching the point clouds from the plurality of sensors together to generate a reconstructed model of an as-manufactured object. The method further includes comparing the reconstructed model of the as-manufactured object to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.
In accordance with another embodiment or any of the previous embodiments, the method or set of functions further includes placing the plurality of sensors in a predetermined array or arrays of different types of sensors.
In accordance with another embodiment or any of the previous embodiments, the method or set of functions further includes placing the plurality of sensors a predetermined distance from the object to avoid interference from humans and equipment during manufacturing.
In accordance with another embodiment or any of the previous embodiments, the method or set of functions further includes positioning the object in a selected orientation for performing the scanning operation with a surface or side of the object to be scanned facing the sensors and storing the 3-D point cloud data after performing the scanning operation. The method or set of functions additionally includes repositioning the object in other selected orientations for scanning other surfaces or sides of the object and storing the 3-D point cloud data after performing each scanning operation. Stitching the point clouds together includes stitching the point clouds from each of the selected orientations of the object together.
In accordance with another embodiment or any of the previous embodiments, performing the scanning operation includes creating a depth map using a group of time-of-flight sensors of the plurality of sensors. The depth map includes a point cloud of point cloud data including distances from the time-of-flight sensors to the object. Performing the scanning operation also includes scanning the object using a multiplicity of polarization sensors or scanning sensors of the plurality of sensors using the depth map to collect a multiplicity of polarization images. Each polarization image defines a polarization point cloud generated by each polarization sensor. Performing the scanning operation also includes stitching the polarization images together to form a conjoined polarization sensor map. Performing the scanning operation additionally includes capturing a plurality of surface images of the object using a plurality of 3-D scanning sensors of the plurality of sensors. Each surface image defines a surface image point cloud generated by each of the 3-D scanning sensors of a surface or side of the object facing the 3-D scanning sensors. Performing the scanning operation additionally includes mapping the surface images to the conjoined polarization sensor map to generate a surface representation of the object of the surface or side of the object facing the plurality of sensors.
In accordance with another embodiment or any of the previous embodiments, the method or set of functions further includes illuminating a plurality of scanning regions using the depth map.
In accordance with another embodiment or any of the previous embodiments, the method or set of functions further includes determining a plurality of scanning areas using the depth map.
In accordance with another embodiment or any of the previous embodiments, stitching the polarization images together includes generating a representation of the object from the polarization point clouds. The representation of the object includes a resolution adaptive mesh.
In accordance with another embodiment or any of the previous embodiments, mapping the surface images to the conjoined polarization sensor map includes generating a surface representation of the object using the polarization point clouds and the surface image point clouds. The surface representation of the object includes a resolution adaptive mesh corresponding to the object for 3-D metrology of the object.
In accordance with another embodiment or any of the previous embodiments, generating the surface representation includes fitting a mesh to the polarization point clouds and the surface image point clouds using an intermediate implicit representation of each point cloud.
In accordance with another embodiment or any of the previous embodiments, stitching the point clouds together includes generating a resolution adaptive mesh using the point clouds. The resolution adaptive mesh corresponds to the reconstructed model of the as-manufactured object for 3-D metrology of the object.
In accordance with another embodiment or any of the previous embodiments, stitching the point clouds together includes fitting a mesh to the point clouds using an intermediate implicit representation of each point cloud.
In accordance with another embodiment or any of the previous embodiments, an accuracy of the reconstructed model of the as-manufactured object is within about 0.003 inches of an actual manufactured object.
In accordance with another embodiment or any of the previous embodiments, the method or set of functions further including providing the reconstructed model of the as-manufactured object as an input to a machine controller of a machine to avoid inadvertent contact by the machine.
In accordance with another embodiment or any of the previous embodiments, the plurality of sensors includes one or more of an array of Light Detection and Ranging (LIDAR) devices or systems, an array of special phase imaging sensors, an array of time-of-flight sensors or cameras, an array of stereoscopic cameras, an array of light field cameras, and an array of high resolution cameras, or other devices or sensors capable of performing the functions described herein.
In accordance with another embodiment or any of the previous embodiments, wherein the reconstructed model of the as-manufactured object is generated near real-time to performing the scanning operation.
The features, functions, and advantages that have been discussed can be achieved independently in various embodiments or may be combined in yet other embodiments, further details of which can be seen with reference to the following description and drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block schematic diagram of an example of a system for performing live metrology of an object in accordance with an embodiment of the present disclosure.
FIG. 2 is a flow chart of an example of a method for live metrology of an object in accordance with an embodiment of the present disclosure.
FIG. 3 is a flow chart of an example of a method for performing a scanning operation to generate a surface representation of an object in accordance with an embodiment of the present disclosure.
FIG. 4 is an illustration of an example of a system for performing live metrology of an object in accordance with another embodiment of the present disclosure.
DETAILED DESCRIPTION
The following detailed description of embodiments refers to the accompanying drawings, which illustrate specific embodiments of the disclosure. Other embodiments having different structures and operations do not depart from the scope of the present disclosure. Like reference numerals may refer to the same element or component in the different drawings.
The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
FIG. 1 is a block schematic diagram of an example of a system 100 for performing live metrology of an object 102 in accordance with an embodiment of the present disclosure. The system 100 includes a plurality of sensors 104 for performing scanning operations to collect electronic images 106 of the object 102. The object 102 is being measured, assembled, or manufactured, and/or some other operation is being performed on the object 102 that uses a reconstructed model 112 of the as-manufactured object 102 as described herein. An example of a scanning operation performable by the system 100 is described in more detail with reference to FIGS. 2 and 3. The electronic images 106 include three-dimensional (3-D) point cloud data 108 for live metrology or live 3-D metrology of the object 102. The point cloud data 108 from each sensor 104 defines a point cloud 110 that represents the object 102. Examples of the sensors 104 include but are not necessarily limited to digital cameras, Light Detection and Ranging (lidar) devices or systems, 3-D laser scanning devices, time-of-flight (TOF) cameras or sensors, special phase imaging sensors, stereoscopic cameras, light field cameras, high resolution cameras or sensors, or similar imaging devices. The sensors 104 may include any combination of these different types of sensors, and the different types of sensors 104 may be arranged or placed in a predetermined array or arrays or groupings similar to that in the exemplary embodiment 400 described with reference to FIG. 4. Additionally, the plurality of sensors 104 and structured light projectors 120 are placed or spaced a predetermined distance from the object 102 to avoid any interference from humans or equipment 142 used in manufacturing, assembly or other functions associated with the object 102.
In accordance with an embodiment, the plurality of sensors 104 includes three-dimensional (3-D) scanning sensors 114, TOF sensors 116 and polarization sensors 118 or other type scanning sensors. The exemplary system 100 also includes a plurality of structured light projectors 120 for illuminating a surface 124 of the object 102 facing the sensors 104 for a scanning operation. The 3-D scanning sensors 114 collect 3-D electronic images 106 of the object 102. The electronic images 106 include three-dimensional (3-D) point cloud data 108 for live metrology of the object 102. The point cloud data 108 from each sensor 104 defines a point cloud 110 that represents the object 102.
The TOF sensors 116 collect images 106 including depth “D” or range data of points 122 or 3-D locations on a surface 124 of the object 102 from the sensors 104 during a scanning operation. The TOF sensors 116 determine the depth D or range by reflecting an infrared light beam on the object 102 and measuring a time from transmitting the light beam to return of the reflected light beam from the object 102. A point cloud 110 is generated by each TOF sensor 116 that includes the depth information of points 122 on the object 102. The point clouds 110 generated by the TOF sensors 116 define a depth map 126 for use in generating the reconstructed model 112 of the as-manufactured object 102 as described herein.
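The time-of-flight depth measurement described above can be sketched briefly; the Python below is an illustration only (the disclosure does not specify an implementation language), and the function names are hypothetical. Each depth D is half the round-trip distance travelled by the reflected infrared beam:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_from_round_trip(time_s):
    """Depth to a surface point is half the round-trip distance of the beam."""
    return 0.5 * SPEED_OF_LIGHT * time_s

def depth_map(round_trip_times):
    """Build a depth map (one depth per pixel) from per-pixel round-trip times."""
    return [[depth_from_round_trip(t) for t in row] for row in round_trip_times]
```

For example, a measured round-trip time of 10 nanoseconds corresponds to a depth of about 1.5 meters.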
The polarization sensors 118 or other type scanning sensors collect polarization images 128 or other electronic scanning images that include vector information about the light reflected from points 122 or 3-D locations on the object 102. The polarization images 128 from each polarization sensor 118, or other electronic scanning images from each scanning sensor, define a point cloud 110 that includes measurements comprising 3-D shape information of the object 102 on a microscopic level used in generating the reconstructed model 112 of the as-manufactured object 102 as described herein.
The electronic images 106 include 3-D point cloud data 108 of the object 102. The point cloud data 108 from each sensor 104 defines a point cloud 110 that represents the object 102. Each point cloud 110 includes a multiplicity of points 132 and each point 132 includes at least location information for a corresponding point 122 on the surface 124 of the object 102. The point cloud data 108 or point clouds 110 from each sensor 104 are stored in a database 130 or other data storage device. In accordance with an embodiment, the object 102 is an aircraft or portion of an aircraft and the reconstructed model 112 of the as-manufactured object 102, as described herein, is used for 3-D live metrology of the aircraft or portion of the aircraft during assembly or manufacturing or for some other purpose. In other embodiments, the object 102 is any product or item or portion of a product or item and the reconstructed model 112, as described herein, is used to perform 3-D live metrology on the device or equipment during assembly, manufacturing or other operation.
The system 100 also includes a processor 134 and a live metrology module 136 operating on the processor 134. In accordance with an exemplary embodiment, the live metrology module 136 is configured to perform the operations or functions described with reference to FIGS. 2 and 3. For example, the live metrology module 136 is configured to perform a set of functions including but not limited to stitching the point clouds 110 from the plurality of sensors 104 together to generate the reconstructed model 112 of the as-manufactured object 102 and to compare the reconstructed model 112 to an as-designed model 138 of the object 102 to determine that the object 102 is being manufactured within allowable tolerances 140 to the as-designed model 138 of the object 102. The reconstructed model 112 is generated near real-time to performing the scanning operation described with reference to FIGS. 2 and 3.
In accordance with an embodiment, the system 100 also includes an apparatus 144 for positioning the object 102 in different orientations or positions for scanning. For a large object 102, such as an aircraft or component of an aircraft, the object 102 may need to be positioned with different surfaces or sides facing the sensors 104 so that a reconstructed model 112 of the complete or entire as-manufactured object 102 can be generated or formed. In accordance with an embodiment, the apparatus 144 is also sized so that humans or equipment 142 used in manufacturing, assembly or other functions associated with the object 102 are movable with respect to the apparatus 144. The processor 134 controls operation of the 3-D scanning sensors 114, time-of-flight sensors 116, polarization sensors 118 or scanning sensors, and structured light projectors 120. In accordance with another embodiment, the processor 134 or another processor also controls the apparatus 144 for positioning the object 102 for scanning.
FIG. 2 is a flow chart of an example of a method 200 for live metrology of an object in accordance with an embodiment of the present disclosure. In accordance with an embodiment, the method 200 is embodied in and performed by the system 100 in FIG. 1. In block 202, the sensors are calibrated. A known or standard object, which has dimensions verified against an as-designed object, may be placed in the system 100 to calibrate the sensors to confirm that they are operating properly and are collecting accurate measurement data. Calibration also collects the intrinsic and extrinsic parameters used by the live metrology module 136 to run properly. An accuracy of the reconstructed model of the as-manufactured object is preferably within about 0.003 inches of the actual manufactured object being measured or for which the reconstructed model is being generated.
In block 204, the object is positioned in a selected orientation for performing the scanning operation with a surface or side of the object to be scanned facing the sensors. In block 206, a scanning operation is performed by a plurality of sensors to collect electronic images of the object. The electronic images include 3-D point cloud data for live 3-D metrology of the object. The point cloud data from each sensor defines a point cloud that represents the object similar to that previously described. An example of a scanning operation useable in block 206 will be described with reference to FIG. 3. In block 208, the 3-D point cloud data is stored after performing the scanning operation.
In block 210, a determination is made whether another orientation of the object is to be selected for scanning another surface or side of the object. If another orientation is selected, the method 200 returns to block 204 and the object is positioned or repositioned in another selected orientation for scanning another surface or side of the object. The method 200 then continues similar to that previously described. This process continues until all desired surfaces or sides of the object have been scanned to provide a complete reconstructed model of the as-manufactured object. The point cloud data or 3-D point cloud data is stored after performing each scanning operation.
In block 212, the point clouds from the plurality of sensors for all scanning operations are stitched or fused together to generate the reconstructed model of the as-manufactured object. Stitching the point clouds together includes stitching the point clouds from each of the selected orientations of the object together. In accordance with an embodiment, stitching or fusing the point clouds together includes generating a resolution adaptive mesh using the point clouds. The resolution adaptive mesh corresponds to the reconstructed model of the as-manufactured object for 3-D metrology of the object. An example of stitching or fusing the point clouds together by generating a resolution adaptive mesh is described in U.S. application Ser. No. 15/663,190, entitled “Resolution Adaptive Mesh for Performing 3-D Metrology of an Object,” which is assigned to the same assignee as the present application and is incorporated herein by reference. This application discloses a method and system for representing and optimally combining point cloud data or 3-D point cloud data generated by a plurality of sensors or 3-D scanning system, similar to that described herein, and for measuring and tracking objects in large spatial volumes, such as, for example, a factory during a manufacturing or assembly process of an object, such as an aircraft. The system and method converts point clouds into a surface representation useful for metrology while taking into account the spatially varying resolutions in different directions of the plurality of sensors observing the same object. The system and method does this by weighting the contribution of each point to the surface representation differently depending on the expected accuracy of the point in the point cloud.
This is in contrast to existing surface reconstruction methods that weight all points equally and do not incorporate sensor resolution models or prior knowledge about the expected resolution of the sensors as a function of viewing direction and distance from the object. The system and method optimizes the 3-D surface representation of the object or reconstructed model of the as-manufactured object derived from stitching or fusion of the point clouds from the plurality of sensors observing the same object but from different directions and distances.
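The accuracy-weighted fusion idea can be sketched as follows. This is a simplified illustration of inverse-variance weighting under an assumed per-sensor error model, not the algorithm of the referenced application; all names are hypothetical:

```python
def fuse_point(measurements):
    """Combine overlapping 3-D measurements of one surface point.

    measurements: list of (xyz, sigma) pairs, where sigma is the expected
    measurement error of the contributing sensor for this point's viewing
    direction and distance. Returns the inverse-variance-weighted position,
    so more accurate sensors contribute more to the fused surface.
    """
    weights = [1.0 / (sigma ** 2) for _, sigma in measurements]
    total = sum(weights)
    return [
        sum(w * xyz[i] for (xyz, _), w in zip(measurements, weights)) / total
        for i in range(3)
    ]

# A precise close-range scan (sigma of 1 mm) dominates a coarse
# long-range scan (sigma of 10 mm) of the same surface point.
close_scan = ([1.000, 2.000, 3.000], 0.001)
far_scan = ([1.010, 2.010, 3.010], 0.010)
fused = fuse_point([close_scan, far_scan])
```

Weighting all points equally would instead pull the fused position halfway toward the less accurate measurement.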
In accordance with another embodiment, stitching the point clouds together includes fitting a mesh to the point clouds using an intermediate implicit representation of each point cloud. An example of stitching or fusing point clouds together using an intermediate implicit representation of each point cloud is described in U.S. application Ser. No. 15/663,243, entitled “Resolution Adaptive Mesh That Is Generated Using an Intermediate Implicit Representation of a Point Cloud,” which is assigned to the same assignee as the present application and is incorporated herein by reference. This application describes a system and method for representing and optimally combining point cloud data or 3-D point cloud data generated by a plurality of sensors or 3-D scanning system, similar to that described herein, that measures and tracks objects in large spatial volumes such as a factory floor during an assembly or manufacturing process of such an object, such as an aircraft. The method or system converts point clouds into a surface representation useful for metrology while taking into account the spatially varying resolutions in different directions of the plurality of sensors or 3-D scanning system observing the same object. The method or system does this by converting the point cloud or point clouds into an optimized point cloud for mesh fitting using an intermediate implicit representation of the point cloud or point clouds that takes into consideration the resolution of each of the sensors of the 3-D scanning system. The method or system optimizes the 3-D surface representation of the object or reconstructed model of the as-manufactured object derived from stitching or fusion of the point clouds from the plurality of sensors observing the same object but from different directions and distances.
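A minimal sketch of what an intermediate implicit representation looks like: the point cloud is treated as defining a distance field, and candidate points are resampled from a narrow band around the implicit surface before mesh fitting. This stand-in uses a plain unsigned nearest-point distance and ignores the per-sensor resolution handling of the referenced application; all names are illustrative:

```python
import math

def distance_to_cloud(q, cloud):
    """Unsigned distance from query point q to the nearest point in the
    cloud; a simple stand-in for an implicit (level set) representation."""
    return min(math.dist(q, p) for p in cloud)

def resample_near_surface(candidates, cloud, band=0.05):
    """Keep candidate grid points lying within a narrow band around the
    implicit surface, yielding an evened-out point set for mesh fitting."""
    return [q for q in candidates if distance_to_cloud(q, cloud) <= band]
```

A mesh fitted to the resampled band points is less sensitive to the uneven sampling density of the raw clouds than a mesh fitted to the raw points directly.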
In block 214, the reconstructed model of the as-manufactured object is compared to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.
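The comparison in block 214 can be sketched as follows, assuming point correspondences between the reconstructed and as-designed models are already established (registration and correspondence search are omitted, and the function names are hypothetical):

```python
import math

# Allowable tolerance, per the accuracy of about 0.003 inches noted above.
ALLOWABLE_TOLERANCE_IN = 0.003

def within_tolerance(as_manufactured, as_designed, tol=ALLOWABLE_TOLERANCE_IN):
    """True if every reconstructed point deviates from its corresponding
    as-designed point by no more than tol (same units as the points)."""
    return all(
        math.dist(m, d) <= tol
        for m, d in zip(as_manufactured, as_designed)
    )
```

In practice the deviation at each point, not just the pass/fail result, would be reported so that out-of-tolerance regions can be located on the object.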
In block 216, any deviation between the as-manufactured object and the as-designed object is determined using the reconstructed model of the as-manufactured object. The reconstructed model is used for 3-D metrology of the object similar to that described in U.S. application Ser. No. 15/663,190, entitled “Resolution Adaptive Mesh for Performing 3-D Metrology of an Object” and U.S. application Ser. No. 15/663,243, entitled “Resolution Adaptive Mesh That Is Generated Using an Intermediate Implicit Representation of a Point Cloud.” An operation may be performed on the object, such as a manufacturing operation using the reconstructed model. For example, in accordance with an embodiment, the reconstructed model of the as-manufactured object is used as an input to a machine controller of a machine to avoid inadvertent contact by the machine.
FIG. 3 is a flow chart of an example of a method 300 for performing a scanning operation to generate a surface representation of an object in accordance with an embodiment of the present disclosure. In accordance with an embodiment, the method 300 is used for performing the scanning operation in block 206 in FIG. 2. In accordance with an embodiment, the method 300 is embodied in and performed by the system 100 in FIG. 1. In block 302, a depth map is created using a group of time-of-flight sensors of the plurality of sensors. The depth map includes a point cloud of point cloud data including distances or ranges from the time-of-flight sensors to the object.
In block 304, a plurality of scanning regions are illuminated using the depth map. In accordance with the exemplary embodiment in FIG. 1, the scanning regions are illuminated by the structured light projectors 120.
In block 306, the object is scanned using a multiplicity of polarization sensors or other scanning sensors of the plurality of sensors using the depth map to collect a multiplicity of polarization images. A plurality of scanning areas on the object are determined using the depth map. Each polarization image defines a polarization point cloud generated by each polarization sensor. In accordance with an embodiment, the polarization or scanning sensors are mounted on elevators for movement relative to the object for scanning. Elevator zones associated with the elevators are assigned by square footage to keep the areas defined by the elevator zones clear of objects during a scanning operation. Sensors are also mountable on pan/tilt units for performing scanning operations, and the areas associated with these units are also kept clear of objects during scanning.
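One simple way the depth map could drive the determination of scanning areas is by masking out pixels whose measured depth falls outside the scanners' working range; this is an illustrative sketch with hypothetical range thresholds, not the method of the disclosure:

```python
def scanning_mask(depth_map, near=0.5, far=5.0):
    """Mark depth-map pixels whose measured depth falls inside the
    scanners' working range (near to far, in meters); True pixels
    belong to a scanning area for the polarization/scanning sensors."""
    return [[near <= d <= far for d in row] for row in depth_map]
```

Connected regions of True pixels would then be grouped into the scanning areas (and illumination targets) assigned to individual sensors.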
In block 308, the polarization images are stitched or fused together to form a complete, conjoined or fused polarization sensor map. In accordance with an embodiment, the polarization images are stitched or fused together using techniques described in U.S. application Ser. No. 15/663,190 or U.S. application Ser. No. 15/663,243. Stitching the polarization images together includes generating a representation of the object from the polarization point clouds. The representation of the object includes a resolution adaptive mesh.
In block 310, a plurality of surface images of the object are captured or collected using a plurality of 3-D scanning sensors of the plurality of sensors. Each surface image defines a surface image point cloud generated by each of the 3-D scanning sensors of a surface or side of the object facing the 3-D scanning sensors.
In block 312, the surface images are mapped to the conjoined or complete polarization sensor map to generate a surface representation of the object of the surface or side of the object facing the plurality of sensors. In accordance with an embodiment, the point clouds defining the surface images of the object and the polarization images are mapped or fused using techniques described in U.S. application Ser. No. 15/663,190 or U.S. application Ser. No. 15/663,243. Mapping the surface images to the conjoined or complete polarization sensor map includes generating a surface representation of the object using the polarization point clouds and the surface image point clouds. The surface representation of the object includes a resolution adaptive mesh corresponding to the object for 3-D live metrology of the object. In accordance with an embodiment, generating the surface representation includes fitting a mesh to the polarization point clouds and the surface image point clouds using an intermediate implicit representation of each point cloud.
FIG. 4 is an illustration of an example of a system 400 for performing live metrology of an object 404 in accordance with another embodiment of the present disclosure. In accordance with an exemplary embodiment, the system 400 is used for the system 100 in FIG. 1 or is part of the system 100 in FIG. 1. The system 400 includes an apparatus 402 for positioning an object 404 for scanning. The apparatus 402 may be used for the apparatus 144 in FIG. 1. The apparatus 402 for positioning the object 404 includes a base 406 and a pair of opposite side walls 408 and 410 extending from the base 406. A gantry 412 extends between the tops of the side walls 408 and 410 and moves along a track 414 on each side wall 408 and 410. The gantry 412 includes a crane 416 for positioning or repositioning the object 404 for scanning different surfaces or sides of the object 404 as described herein.
The system 400 also includes a mounting arrangement 418 for supporting a plurality of sensors 420 and lighting projectors 422 for performing scanning operations. In accordance with an embodiment, the mounting arrangement 418 includes a plurality of trusses 424 or a ladder type structure including a plurality of horizontal structural members 426 for mounting the sensors 420 and lighting projectors 422 extending between vertical structural members 428. Similar to that previously described, in accordance with an embodiment, the sensors 420 include different types of sensors or cameras for collecting different types of data for generating a reconstructed model of an as-manufactured object. The sensors 420 are placed in predetermined arrays or groups based on a particular scanning operation and/or object being scanned. For example, a first horizontal structural member 426 a includes a plurality of arrays of pan/tilt units 430 for scanning the object 404 and collecting electronic images including data for generating the reconstructed model of the as-manufactured object 404. Examples of the pan/tilt units 430 include but are not limited to 3-D scanning sensors or devices. A second horizontal structural member 426 b includes a plurality of arrays of 3-D scanning sensors 432 or cameras. A third horizontal structural member 426 c includes a plurality of arrays of time-of-flight sensors 434 and a fourth horizontal structural member 426 d includes a plurality of lighting projectors 422 or infrared lighting projectors. The exemplary embodiment in FIG. 4 illustrates the mounting arrangement 418 and sensors 420 on one side of the system 400 or associated with one side wall 408. In another embodiment, a mounting arrangement 418 and sensors 420 are on both sides of the system 400, or are associated with both side walls 408 and 410.
In accordance with an embodiment, either of the methods 200 and 300 are embodied on a computer program product, such as computer program product 146 in FIG. 1. The computer program product 146 includes a computer readable storage medium, similar to that previously described, having computer program instructions 148 embodied therewith. The computer readable storage medium is not a transitory medium per se. The program instructions are executable by a device, such as processor 134 in FIG. 1, to cause the device to perform the method 200 or 300. In accordance with an embodiment, the computer program instructions 148 define the live metrology module 136, which is stored on a storage device in association with the processor 134 and is downloadable from the computer program product 146.
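The stitching step the live metrology module performs (combining the per-orientation point clouds into one reconstructed model) can be sketched as below. This is an illustrative approximation, not the patent's implementation: it assumes each orientation's rigid pose (rotation R, translation t) is already known from the crane/gantry repositioning, whereas a real system would refine alignment, for example with ICP:

```python
import numpy as np

def stitch_point_clouds(clouds, poses):
    """Merge per-orientation point clouds into a single model frame.
    clouds: list of (N_i, 3) arrays, one per selected orientation.
    poses:  list of (R, t) rigid transforms mapping each cloud into
            the common model frame (assumed known; illustrative only).
    Returns one (sum(N_i), 3) array of stitched points."""
    merged = [np.asarray(pts) @ R.T + t for pts, (R, t) in zip(clouds, poses)]
    return np.vstack(merged)

# Two single-point "clouds" of the same surface point, seen from two
# orientations related by a 180-degree rotation about z plus a shift.
R0, t0 = np.eye(3), np.zeros(3)
R1 = np.diag([-1.0, -1.0, 1.0])
t1 = np.array([2.0, 0.0, 0.0])
model = stitch_point_clouds(
    [np.array([[1.0, 0.0, 0.0]]), np.array([[1.0, 0.0, 0.0]])],
    [(R0, t0), (R1, t1)],
)
# both observations land at (1, 0, 0) in the model frame
```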
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of embodiments of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present embodiments has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to embodiments in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of embodiments.
Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art appreciate that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown and that the embodiments have other applications in other environments. This application is intended to cover any adaptations or variations. The following claims are in no way intended to limit the scope of embodiments of the disclosure to the specific embodiments described herein.

Claims (20)

What is claimed is:
1. A method for live metrology of an object, comprising:
performing a scanning operation by a plurality of different types of sensors to collect electronic images of an object for each of one or more selected orientations of the object relative to the plurality of sensors, the electronic images comprising 3-D point cloud data for live metrology of the object and the point cloud data from each sensor defining a point cloud that represents the object, wherein performing the scanning operation comprises:
creating a depth map using a group of time-of-flight sensors of the plurality of sensors, wherein the depth map comprises a point cloud of point cloud data including distances from the time-of-flight sensors to the object;
scanning the object using a multiplicity of polarization sensors of the plurality of sensors using the depth map to collect a multiplicity of polarization images, wherein each polarization image includes vector information from which light is reflected from points or 3-D locations on the object;
stitching the polarization images together to form a conjoined polarization sensor map;
capturing a plurality of surface images of the object using a plurality of 3-D scanning sensors of the plurality of sensors, wherein each surface image defines a surface image point cloud generated by each of the 3-D scanning sensors of a surface or side of the object facing the 3-D scanning sensors; and
mapping the surface images to the conjoined polarization sensor map to generate a surface representation of the object of the surface or side of the object facing the plurality of sensors; and
stitching the point clouds from each selected orientation of the object together to generate a reconstructed model of an as-manufactured object.
2. The method of claim 1, further comprising placing the plurality of sensors in a predetermined array or arrays of different types of sensors.
3. The method of claim 1, further comprising placing the plurality of sensors a predetermined distance from the object to avoid interference from humans and equipment during manufacturing.
4. The method of claim 1, further comprising:
positioning the object in a selected orientation for performing the scanning operation with a surface or side of the object to be scanned facing the sensors;
storing the 3-D point cloud data after performing the scanning operation;
repositioning the object in other selected orientations for scanning other surfaces or sides of the object; and
storing the 3-D point cloud data after performing each scanning operation, wherein stitching the point clouds together comprises stitching the point clouds from each of the selected orientations of the object together.
5. The method of claim 1, further comprising illuminating a plurality of scanning regions using a structured light projector.
6. The method of claim 1, further comprising illuminating a plurality of scanning regions using the depth map.
7. The method of claim 1, further comprising determining a plurality of scanning areas using the depth map.
8. The method of claim 1, wherein stitching the polarization images together comprises generating a representation of the object from the polarization images, the representation of the object comprising a resolution adaptive mesh.
9. The method of claim 1, wherein mapping the surface images to the conjoined polarization sensor map comprises generating a representation of the object using the polarization images and the surface image point clouds, the surface representation of the object comprising a resolution adaptive mesh corresponding to the object for 3-D metrology of the object.
10. The method of claim 9, wherein generating the surface representation comprises fitting a mesh to the polarization images and the surface image point clouds using an intermediate implicit representation of each point cloud.
11. The method of claim 1, wherein stitching the point clouds together comprises generating a resolution adaptive mesh using the point clouds, the resolution adaptive mesh corresponding to the reconstructed model of the as-manufactured object for 3-D metrology of the object.
12. The method of claim 1, wherein stitching the point clouds together comprises fitting a mesh to the point clouds using an intermediate implicit representation of each point cloud.
13. The method of claim 1, wherein an accuracy of the reconstructed model of the as-manufactured object is within about 0.003 inches to an actual manufactured object.
14. The method of claim 1, further comprising providing the reconstructed model of the as-manufactured object as an input to a machine controller of a machine to avoid inadvertent contact by the machine.
15. The method of claim 1, further comprising comparing the reconstructed model of the as-manufactured object to an as-designed model of the object to determine that the object is manufactured within an allowable tolerance to the as-designed model of the object.
16. The method of claim 1, further comprising performing an operation on the object using the reconstructed model.
17. A system for live metrology of an object, comprising:
a plurality of different types of sensors for performing a scanning operation to collect electronic images of an object, the electronic images comprising 3-D point cloud data for live metrology of the object and the point cloud data from each sensor defining a point cloud that represents the object, wherein performing the scanning operation comprises:
creating a depth map using a group of time-of-flight sensors of the plurality of sensors, wherein the depth map comprises a point cloud of point cloud data including distances from the time-of-flight sensors to the object;
scanning the object using a multiplicity of polarization sensors of the plurality of sensors using the depth map to collect a multiplicity of polarization images, wherein each polarization image includes vector information from which light is reflected from points or 3-D locations on the object;
stitching the polarization images together to form a conjoined polarization sensor map;
capturing a plurality of surface images of the object using a plurality of 3-D scanning sensors of the plurality of sensors, wherein each surface image defines a surface image point cloud generated by each of the 3-D scanning sensors of a surface or side of the object facing the 3-D scanning sensors; and
mapping the surface images to the conjoined polarization sensor map to generate a surface representation of the object of the surface or side of the object facing the plurality of sensors; wherein the system further comprises
a processor; and
a live metrology module operating on the processor, the live metrology module being configured to perform a set of functions comprising:
stitching the point clouds from one or more scanning operations of one or more selected orientations of the object together to generate a reconstructed model of an as-manufactured object.
18. The system of claim 17, wherein the plurality of sensors comprise one or more of an array of Light Detection and Ranging (LIDAR) devices or system, an array of special phase imaging sensors, an array of time-of-flight sensors or cameras, an array of stereoscopic cameras, an array of light field cameras, and an array of high resolution cameras.
19. The system of claim 17, wherein the reconstructed model of the as-manufactured object is generated near real-time to performing the scanning operation.
20. A computer program product for live metrology of an object, the computer program product comprising a computer readable storage medium having program instructions embodied therewith, wherein the computer readable storage medium is not a transitory medium per se, the program instructions being executable by a device to cause the device to perform a method comprising:
performing a scanning operation by a plurality of different types of sensors to collect electronic images of an object for each of one or more selected orientations of the object relative to the plurality of sensors, the electronic images comprising 3-D point cloud data for live metrology of the object and the point cloud data from each sensor defining a point cloud that represents the object, wherein performing the scanning operation comprises:
creating a depth map using a group of time-of-flight sensors of the plurality of sensors, wherein the depth map comprises a point cloud of point cloud data including distances from the time-of-flight sensors to the object;
scanning the object using a multiplicity of polarization sensors of the plurality of sensors using the depth map to collect a multiplicity of polarization images, wherein each polarization image includes vector information from which light is reflected from points or 3-D locations on the object;
stitching the polarization images together to form a conjoined polarization sensor map;
capturing a plurality of surface images of the object using a plurality of 3-D scanning sensors of the plurality of sensors, wherein each surface image defines a surface image point cloud generated by each of the 3-D scanning sensors of a surface or side of the object facing the 3-D scanning sensors; and
mapping the surface images to the conjoined polarization sensor map to generate a surface representation of the object of the surface or side of the object facing the plurality of sensors; and
stitching the point clouds from each selected orientation of the object together to generate a reconstructed model of an as-manufactured object.
US15/663,397 2017-07-28 2017-07-28 Live metrology of an object during manufacturing or other operations Active 2038-10-29 US10732284B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US15/663,397 US10732284B2 (en) 2017-07-28 2017-07-28 Live metrology of an object during manufacturing or other operations
ES18173753T ES2757561T3 (en) 2017-07-28 2018-05-23 Live metrology of an object during manufacturing or other operations
EP18173753.7A EP3435028B1 (en) 2017-07-28 2018-05-23 Live metrology of an object during manufacturing or other operations
KR1020180071265A KR102643295B1 (en) 2017-07-28 2018-06-21 Live metrology of an object during manufacturing or other operations
JP2018136443A JP7294778B2 (en) 2017-07-28 2018-07-20 Live measurement of objects during manufacturing or other operations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/663,397 US10732284B2 (en) 2017-07-28 2017-07-28 Live metrology of an object during manufacturing or other operations

Publications (2)

Publication Number Publication Date
US20190033461A1 (en) 2019-01-31
US10732284B2 (en) 2020-08-04

Family

ID=62597302

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/663,397 Active 2038-10-29 US10732284B2 (en) 2017-07-28 2017-07-28 Live metrology of an object during manufacturing or other operations

Country Status (5)

Country Link
US (1) US10732284B2 (en)
EP (1) EP3435028B1 (en)
JP (1) JP7294778B2 (en)
KR (1) KR102643295B1 (en)
ES (1) ES2757561T3 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3480626A1 (en) * 2017-11-02 2019-05-08 Koninklijke Philips N.V. Improved depth image reconstruction
CA3157194C (en) 2019-10-07 2023-08-29 Boston Polarimetrics, Inc. Systems and methods for augmentation of sensor systems and imaging systems with polarization
CN112686949A (en) * 2020-12-30 2021-04-20 深圳艾灵网络有限公司 Vehicle positioning method, system and related equipment
CN113190772B (en) * 2021-03-31 2024-11-08 广州朗国电子科技股份有限公司 A method, system and storage medium for information release and feedback of smart cloud screen
EP4198449A1 (en) * 2021-12-14 2023-06-21 Hexagon Technology Center GmbH Metrology system
CN116540255A (en) * 2022-01-26 2023-08-04 上海飞机制造有限公司 System and method for measuring and obtaining plane shape by using multiple laser radars
US11875457B2 (en) * 2022-02-22 2024-01-16 Zebra Technologies Corporation 3D product reconstruction from multiple images collected at checkout lanes

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020167518A1 (en) 1996-10-16 2002-11-14 Alexander Migdal System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities
US20050140670A1 (en) 2003-11-20 2005-06-30 Hong Wu Photogrammetric reconstruction of free-form objects with curvilinear structures
US20060142971A1 (en) 2004-12-08 2006-06-29 David Reich All surface data for use in substrate inspection
US20090160852A1 (en) * 2007-12-19 2009-06-25 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. System and method for measuring a three-dimensional object
US20110316978A1 (en) 2009-02-25 2011-12-29 Dimensional Photonics International, Inc. Intensity and color display for a three-dimensional metrology system
US20120301013A1 (en) 2005-01-07 2012-11-29 Qualcomm Incorporated Enhanced object reconstruction
FR2992762A1 (en) 2012-06-29 2014-01-03 Real Fusio France METHOD FOR GENERATING A MESH OF AT LEAST ONE OBJECT IN THREE DIMENSIONS
US8983794B1 (en) 2010-10-04 2015-03-17 The Boeing Company Methods and systems for non-destructive composite evaluation and repair verification
US20150213646A1 (en) 2014-01-28 2015-07-30 Siemens Aktiengesellschaft Method and System for Constructing Personalized Avatars Using a Parameterized Deformable Mesh
US20150294036A1 (en) 2014-04-10 2015-10-15 Dassault Systemes Fitting sample points with an isovalue surface
US20150363972A1 (en) 2013-01-21 2015-12-17 Saab Vricon Systems Ab A method and arrangement for providing a 3d model
US20160261844A1 (en) 2015-03-06 2016-09-08 Massachusetts Institute Of Technology Methods and Apparatus for Enhancing Depth Maps with Polarization Cues
US20170004649A1 (en) 2015-06-30 2017-01-05 Alvaro Collet Romea Mixed three dimensional scene reconstruction from plural surface models
US20170010087A1 (en) 2015-07-07 2017-01-12 Quality Vision International, Inc. Method and apparatus for scanning object
US20170053438A1 (en) 2014-06-13 2017-02-23 Shenzhen Institutes Of Advanced Technology Chinese Academy Of Sciences Method and system for reconstructing a three-dimensional model of point clouds
US20170193699A1 (en) 2015-12-31 2017-07-06 Dassault Systemes Reconstructing A 3D Modeled Object

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009040991B4 (en) * 2009-09-10 2012-11-08 Carl Zeiss Ag Measuring arrangement and method for measuring a surface
JP2013186100A (en) * 2012-03-12 2013-09-19 Hitachi Ltd Shape inspection method and device
WO2016194728A1 (en) * 2015-06-01 2016-12-08 新日鐵住金株式会社 Method and device for inspection of crankshaft

Patent Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6611267B2 (en) * 1996-10-16 2003-08-26 Viewpoint Corporation System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities
US20020167518A1 (en) 1996-10-16 2002-11-14 Alexander Migdal System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities
US20050140670A1 (en) 2003-11-20 2005-06-30 Hong Wu Photogrammetric reconstruction of free-form objects with curvilinear structures
US20060142971A1 (en) 2004-12-08 2006-06-29 David Reich All surface data for use in substrate inspection
US7593565B2 (en) * 2004-12-08 2009-09-22 Rudolph Technologies, Inc. All surface data for use in substrate inspection
US20120301013A1 (en) 2005-01-07 2012-11-29 Qualcomm Incorporated Enhanced object reconstruction
US9234749B2 (en) * 2005-01-07 2016-01-12 Qualcomm Incorporated Enhanced object reconstruction
US8072450B2 (en) 2007-12-19 2011-12-06 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. System and method for measuring a three-dimensional object
US20090160852A1 (en) * 2007-12-19 2009-06-25 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. System and method for measuring a three-dimensional object
US20110316978A1 (en) 2009-02-25 2011-12-29 Dimensional Photonics International, Inc. Intensity and color display for a three-dimensional metrology system
US8983794B1 (en) 2010-10-04 2015-03-17 The Boeing Company Methods and systems for non-destructive composite evaluation and repair verification
FR2992762A1 (en) 2012-06-29 2014-01-03 Real Fusio France METHOD FOR GENERATING A MESH OF AT LEAST ONE OBJECT IN THREE DIMENSIONS
US20150363972A1 (en) 2013-01-21 2015-12-17 Saab Vricon Systems Ab A method and arrangement for providing a 3d model
US20150213646A1 (en) 2014-01-28 2015-07-30 Siemens Aktiengesellschaft Method and System for Constructing Personalized Avatars Using a Parameterized Deformable Mesh
US9524582B2 (en) * 2014-01-28 2016-12-20 Siemens Healthcare Gmbh Method and system for constructing personalized avatars using a parameterized deformable mesh
US20150294036A1 (en) 2014-04-10 2015-10-15 Dassault Systemes Fitting sample points with an isovalue surface
US9928314B2 (en) * 2014-04-10 2018-03-27 Dassault Systemes Fitting sample points with an isovalue surface
US20170053438A1 (en) 2014-06-13 2017-02-23 Shenzhen Institutes Of Advanced Technology Chinese Academy Of Sciences Method and system for reconstructing a three-dimensional model of point clouds
US10062207B2 (en) * 2014-06-13 2018-08-28 Shenzhen Institutes Of Advanced Technology Chinese Academy Of Sciences Method and system for reconstructing a three-dimensional model of point clouds
US20160261844A1 (en) 2015-03-06 2016-09-08 Massachusetts Institute Of Technology Methods and Apparatus for Enhancing Depth Maps with Polarization Cues
US10260866B2 (en) * 2015-03-06 2019-04-16 Massachusetts Institute Of Technology Methods and apparatus for enhancing depth maps with polarization cues
US20170004649A1 (en) 2015-06-30 2017-01-05 Alvaro Collet Romea Mixed three dimensional scene reconstruction from plural surface models
US9646410B2 (en) * 2015-06-30 2017-05-09 Microsoft Technology Licensing, Llc Mixed three dimensional scene reconstruction from plural surface models
US20170010087A1 (en) 2015-07-07 2017-01-12 Quality Vision International, Inc. Method and apparatus for scanning object
US20170193699A1 (en) 2015-12-31 2017-07-06 Dassault Systemes Reconstructing A 3D Modeled Object

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Eiben, A.E., et al.; "Genetic algorithms with multi-parent recombination," PPSN III: Proceedings of the International Conference on Evolutionary Computation, The Third Conference on Parallel Problem Solving from Nature, 1994, pp. 78-87.
European Patent Office; European Search Report for European Application No. 18174588 dated Aug. 15, 2018, 4 Pages.
European Patent Office; Office Action for European Application No. 18173753.7 dated Aug. 30, 2018, 11 Pages.
European Patent Office; Office Action for European Application No. 18174588.6 dated Sep. 10, 2018, 8 Pages.
Kadambi, Achuta, et al.; "Polarized 3D: High-Quality Depth Sensing with Polarization Cues," 2015 IEEE International Conference on Computer Vision, 2015, pp. 3370-3378.
Kennedy, James, et al.; "Particle Swarm Optimization," Proceedings of IEEE International Conference on Neural Networks, IV, 1995, pp. 1942-1948.
Mahdaoui, Abdelaaziz, et al.; "Comparative Study of Combinatorial 3D Reconstruction Algorithms," International Journal of Engineering Trends and Technology, 2017, pp. 247-251, vol. 48.
Powell, M.J.D.; "An efficient method for finding the minimum of a function of several variables without calculating derivatives," Computer Journal, 7(2), 1964, pp. 155-162.

Also Published As

Publication number Publication date
KR102643295B1 (en) 2024-03-04
ES2757561T3 (en) 2020-04-29
US20190033461A1 (en) 2019-01-31
JP7294778B2 (en) 2023-06-20
EP3435028B1 (en) 2019-08-21
KR20190013467A (en) 2019-02-11
JP2019039909A (en) 2019-03-14
EP3435028A1 (en) 2019-01-30

Similar Documents

Publication Publication Date Title
US10732284B2 (en) Live metrology of an object during manufacturing or other operations
Soudarissanane et al. Optimizing terrestrial laser scanning measurement set-up
CN107702662B (en) Reverse monitoring method and system based on laser scanner and BIM
US9869755B2 (en) Laser scanner and method of registering a scene
EP3754363A1 (en) Method and apparatus for registering three-dimensional point clouds
JP2016060610A (en) Elevator hoistway internal dimension measuring device, elevator hoistway internal dimension measuring controller, and elevator hoistway internal dimension measuring method
Cho et al. Target-focused local workspace modeling for construction automation applications
US9857232B2 (en) Device for non-contact temperature measurement and temperature measurement method
JP7339629B2 (en) Spatial curve co-locating projection system using multiple laser galvo scanners and method thereof
CN117146710B (en) Dynamic projection 3D reconstruction system and method based on active vision
US11692812B2 (en) System and method for measuring three-dimensional coordinates
Sommer et al. Scan methods and tools for reconstruction of built environments as basis for digital twins
Mader et al. An integrated flexible self-calibration approach for 2D laser scanning range finders applied to the Hokuyo UTM-30LX-EW
JPWO2017199785A1 (en) Monitoring system setting method and monitoring system
US11941793B2 (en) Artificial intelligence based registration support for environmental scans
US11953310B2 (en) Method for measuring gap and flush of vehicle parts and measuring tunnel
CN114935748A (en) Large-baseline multi-laser-radar calibration method and system based on detected object
Huang et al. Extrinsic calibration of a multi-beam LiDAR system with improved intrinsic laser parameters using v-shaped planes and infrared images
Al-Durgham et al. Bundle adjustment-based stability analysis method with a case study of a dual fluoroscopy imaging system
GB2543658A (en) Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
EP4094096A1 (en) A method and system for generating a colored tridimensional map
Gražulis et al. The horizontal deformation analysis of high-rise buildings
Meissner et al. Simulation and calibration of infrastructure based laser scanner networks at intersections
CN119205893B (en) Pose determining method and device based on image and point cloud, working machine and medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE BOEING COMPANY, ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WINGERT, LIAM ANTONIO;CANTRELL, CHRIS A.;BAKER, ANTHONY W.;AND OTHERS;SIGNING DATES FROM 20170720 TO 20170728;REEL/FRAME:043134/0397

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4