WO2021193347A1 - Data generation system, data generation method, data generation device, and additional learning requirement assessment device
- Publication number: WO2021193347A1 (PCT/JP2021/011062)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- defective
- local shape
- image
- data generation
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- The present disclosure generally relates to a data generation system for learning data, a data generation method, a data generation device, and an additional learning necessity assessment device including a data generation device.
- In the prior art, design data having three-dimensional information, or mesh data having three-dimensional information imitating parts, is rendered from many directions to generate a large number of images, and a learning target area is recognized in each generated image.
- A method is disclosed in which a label of the recognized learning target area is added to each image, expansion processing is further performed to obtain training data, and a classifier is generated by modeling with the generated training data.
- the data generation system of the present disclosure includes a data extraction unit and a data extension unit.
- the data extraction unit extracts the first data regarding the defective local shape from the data of the object having the defective local shape.
- the data extension unit combines the first data with the second data of an object having no defective local shape.
- the data extraction unit extracts the first data based on the data of the object having the defective local shape and the position information data of the defective local shape.
- the learning data of the present disclosure has an advantage of improving the determination accuracy of the learning device that determines the class assigned to the object.
- FIG. 5C is a schematic diagram showing another example of image data included in defective product data, generated by adding an additional image to the image data shown in FIG. 5A.
- FIG. 6 is a block diagram showing the generation system of the non-defective product data according to the second embodiment of the present disclosure.
- FIG. 7A is a diagram showing an image in the vicinity of the bead of a non-defective sample according to the third embodiment of the present disclosure.
- FIG. 7B is a diagram showing an enlarged image near the bead of the non-defective sample.
- A block diagram of the first means for obtaining the defective local part data according to this embodiment.
- A block diagram of the second means for obtaining the defective local part data according to this embodiment.
- the learning data according to the first embodiment according to the present disclosure is the data that is the basis for determining the class assigned to the object.
- the "class” here is a classification such as a non-defective product and a defective product, and may be three or more classes such as a non-defective product, an adaptive product, and a defective product.
- The target for class determination using the learning data is, for example, as shown in FIG. 1, the welded portion where two or more members (here, the first plate B11 and the second plate B12) are welded.
- The learning device outputs, as an estimation result, whether the bead B1 is a non-defective product or a defective product and, if defective, the type of defect. That is, the learning device is used in a weld appearance inspection to inspect whether the bead B1 is a non-defective product, in other words, whether the welding was performed correctly.
- Whether the bead B1 is a non-defective product is determined by, for example, whether numerical values such as the length of the bead B1, its height, its rising angle, its throat thickness, its excess weld metal, and the misalignment of the welded portion (including the misalignment of the start end of the bead B1) fall within the permissible ranges. For example, if any one of the conditions listed above does not fall within its permissible range, the bead B1 is determined to be defective.
- Whether or not the bead B1 is a non-defective product is also determined by, for example, the presence or absence of an undercut B2 (see FIG. 2A), a pit B3 (see FIG. 2B), spatter B4 (see FIG. 2C), and protrusions of the bead B1. For example, if any one of the defective parts listed above occurs, the bead B1 is determined to be a defective product.
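As an illustration of the pass/fail rule just described (every measured quantity must lie in its permissible range, and no defective local shape may be present), the logic can be sketched as follows. All tolerance values, field names, and the function name `is_good_bead` are hypothetical, not taken from the disclosure:

```python
# Illustrative sketch of the bead pass/fail rule described above.
# All tolerance ranges and key names are hypothetical examples.

TOLERANCES = {
    "length_mm": (48.0, 52.0),   # bead length
    "height_mm": (1.0, 3.0),     # bead height
    "throat_mm": (3.0, 6.0),     # throat thickness
    "rise_deg":  (20.0, 60.0),   # rising angle
}

DEFECT_FLAGS = ("undercut", "pit", "spatter", "protrusion")

def is_good_bead(measurements: dict, defects: dict) -> bool:
    """A bead is non-defective only if every measurement is within its
    permissible range and no defective local shape is present."""
    for key, (lo, hi) in TOLERANCES.items():
        if not (lo <= measurements[key] <= hi):
            return False
    return not any(defects.get(flag, False) for flag in DEFECT_FLAGS)

print(is_good_bead(
    {"length_mm": 50, "height_mm": 2, "throat_mm": 4, "rise_deg": 30},
    {"pit": False}))   # in-range, no defects -> True
```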
- CG: Computer Graphics
- The data expansion process refers to a process of artificially inflating training data by applying processing such as translation, enlargement/reduction, rotation, inversion, or addition of noise to given data.
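The elementary expansion operations named above (translation, rotation, inversion, noise addition) can be sketched with NumPy. This is a minimal, dependency-free illustration rather than the disclosure's implementation; rotation is limited to 90-degree steps to keep the example simple:

```python
import numpy as np

def augment(img: np.ndarray, shift=(0, 0), k_rot=0, flip=False,
            noise_std=0.0, rng=None) -> np.ndarray:
    """Apply the basic expansion operations named in the text:
    translation, rotation, inversion, and noise addition."""
    out = np.roll(img, shift, axis=(0, 1))        # translation
    out = np.rot90(out, k=k_rot)                  # rotation (90-degree steps)
    if flip:
        out = np.flip(out, axis=1)                # horizontal inversion
    if noise_std > 0:
        rng = rng or np.random.default_rng(0)
        out = out + rng.normal(0.0, noise_std, out.shape)  # noise addition
    return out

img = np.arange(16, dtype=float).reshape(4, 4)
variants = [augment(img, shift=(1, 0)), augment(img, k_rot=1),
            augment(img, flip=True), augment(img, noise_std=0.1)]
print(len(variants))  # 4 augmented copies of one image
```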
- the problem is that the learning data obtained in this way is significantly different from the defective product data obtained in the actual inspection environment.
- The image data obtained from the sensors used in the inspection includes noise and virtual images. Therefore, the determination accuracy of a learning device that class-determines the image to be inspected, when that device is generated using such learning data, differs depending on the sensor model used in the inspection and on the noise removal processing applied to the acquired image data. In other words, in order to improve the determination accuracy of the learning device in class determination, it is necessary to generate the learning data in consideration of differences in the inspection environment.
- FIG. 3 is a block diagram showing a defective product data 110 generation system 100 according to the first embodiment.
- A teacher object 120 having a locally defective shape (hereinafter referred to as a "defective local shape") is used.
- the inspector photographs the teacher object 120 using the sensors used in the inspection.
- the creator of the teacher object 120 extracts the defective local shape data 180 of the teacher object based on the data about the teacher object 120 obtained by photographing and the information about the position and range of the defective local shape in the teacher object.
- the defective product data 110 that contributes to the improvement of the determination accuracy of the learner is generated.
- The defective product data 110 is generated by the data extraction unit 130 and the data expansion unit 140.
- the data extraction unit 130 includes the photographing unit 150, and generates the defective local shape data 180 based on the teacher object data 160 obtained from the photographing unit 150 and the position information data 170 of the defective local shape of the teacher object 120.
- the teacher object 120 is an object determined by the learner (or, in other words, an object to be recognized).
- the teacher object 120 includes, for example, a bead B1 formed at a welded portion.
- The material of the teacher object 120 is metal, but in another embodiment the teacher object 120 may be made of, for example, resin; the material is not limited. Further, the teacher object 120 may be made of a single material or of a plurality of different materials; for example, a composite of low-carbon steel and high-carbon steel may be used.
- the teacher object 120 has a defective local shape.
- 2A to 2C are views showing an example of a cross section of the bead B1 having a defective local shape.
- the defective local shape is, for example, undercut B2, pit B3, spatter B4, etc. generated during welding.
- the defective local shape is not limited to the above-mentioned shape, and may be, for example, a protrusion or melt-off.
- a plurality of defective local shapes may be present in one teacher object 120, or a plurality of types of defective local shapes may be included.
- The defective local shape may be created by imitating the defective local shape of the object to be inspected, or may be created intentionally. For example, when the object to be inspected is created by welding, the defective local shape of the teacher object 120 can be created by welding under conditions where defects are likely to occur, imitating the defective local shapes of defective products that occur in the actual inspection environment.
- the photographing unit 150 photographs the teacher object 120 and generates image information (teacher object data 160).
- the teacher object 120 having a defective local shape is provided to the inspection environment.
- the teacher object 120 is photographed using the imaging unit 150 used in the inspection.
- the teacher object data 160 which is the image information regarding the shape of the teacher object 120, is acquired.
- the image information may be three-dimensional image information or two-dimensional image information.
- the teacher object data 160 may be three-dimensional point cloud data.
- the device used for shooting may be any device that can acquire image information, regardless of the type and performance of the device.
- The photographing unit 150 is an imaging device, but the present invention is not limited to this; the photographing unit may instead be configured to accept the input of data such as an image captured in advance by the imaging device used in the inspection. In that case, the photographing unit 150 can be referred to as a photographing data receiving unit.
- the teacher object 120 is photographed under a plurality of imaging conditions.
- the teacher object 120 is photographed at a plurality of photographing angles and a plurality of photographing positions. This makes it possible to efficiently collect information on the shape of one teacher object.
- the creator or provider of the teacher object 120 owns the position information data 170 of the defective local shape of the teacher object 120, which is information regarding the position and range of the defective local shape of the teacher object 120.
- the position information data 170 of the defective local shape is determined, for example, by the creator or provider of the teacher object 120. This will be described with reference to FIG. FIG. 4 is a schematic diagram showing an example of image data included in the teacher object 120 in the defective product data 110 generation system 100.
- the image data has a protrusion C1 as a defective local shape.
- the creator or provider of the teacher object 120 defines the protrusion C1 in FIG. 4 as a defective local shape, and stores information on the position and range of the defective local shape as position information data 170 of the defective local shape.
- The data extraction unit 130 extracts, from the teacher object data 160, the defective local shape data 180 containing the information related to the defective local shape, based on the position information data 170 of the defective local shape.
- the defective local shape data 180 is image information unique to each inspection environment. In other words, even if the same teacher object 120 is used, the obtained defective local shape data 180 differs depending on the inspection environment.
- the defective local shape data 180 is image data related to the defective local shape, and can also be described as the first data.
- the non-defective product data 190 is image data related to an object determined by the learner (or, in other words, an object to be recognized) without a defective local shape, and can also be described as a second data.
- the non-defective product data 190 is assumed to be data obtained in the inspection environment, but may be data obtained outside the inspection environment.
- the data expansion unit 140 generates defective product data 110 by combining the above-mentioned defective local shape data 180 and non-defective product data 190.
- This "combination” may be one in which the defective local shape data 180 and the non-defective product data 190 are added (combined), or the defective local shape data 180 is subjected to data expansion processing and then added (combined). It may be a thing.
- the data expansion process is performed by selectively using the parameters of the data expansion process.
- the "data expansion processing" referred to in the present disclosure is newly learned data based on the parameters of the data expansion processing without using the defective local shape data 180. May include processing to generate.
- the data expansion process may include a process of generating image data including a good bead B1 or an image data including a defective bead B1 by CG technology without using the first data.
- the data related to the defective shape may be expanded to the non-defective product data, or the data related to the shape other than the defective shape may be expanded to the non-defective product data.
- the "parameter of data expansion processing" referred to in the present disclosure is data expansion processing such as translation, enlargement / reduction, rotation, inversion, or addition of noise, which is executed for a part or all of the data to be processed.
- the degree of For example, when the image data of the defective bead B1 having protrusions on the surface is used as the data to be processed, the parameters of the data expansion processing may include the amount of movement to move the protrusions, the size of the protrusions, the amount of rotation of the protrusions, and the like. ..
- the parameters of the data expansion process are set in a range that can be changed for each type of process. For example, when the parameter is the movement amount for moving the protrusion, the movement amount can be changed in the range of 0 to several tens of mm.
- the parameter of the data expansion process may be one value.
- One or more parameters can be appropriately selected and used from parameters such as aspect ratio, shape, angle, bottom shape, depth, and smoothness ratio.
- a range that can be changed is set for each of the plurality of types of parameters.
- One or more parameters are selected from the plurality of types of parameters, and data expansion processing is executed sequentially on the second data while changing the processing amount of each parameter within its changeable range.
- a large number of training data are generated based on one first data.
- a large number of defective product data 110 are generated by expanding the defective local shape data 180 obtained from one teacher object 120 with respect to the non-defective product data 190.
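The expansion of one defective local shape into many defective images can be sketched as a parameter sweep over paste position and patch size. All names and ranges here are illustrative, and integer upscaling via `np.kron` stands in for a real enlargement/reduction operation:

```python
import numpy as np
from itertools import product

def sweep(good, patch, rows, cols, scales):
    """Generate many defective images from one defect patch by
    sweeping the paste position and patch size over their ranges."""
    results = []
    for r, c, s in product(rows, cols, scales):
        p = np.kron(patch, np.ones((s, s)))   # crude integer upscaling
        out = good.copy()
        out[r:r + p.shape[0], c:c + p.shape[1]] = p
        results.append(out)
    return results

good = np.zeros((16, 16))               # stand-in for non-defective data
patch = np.full((2, 2), 9.0)            # stand-in for one defect patch
data = sweep(good, patch, rows=range(0, 8, 2), cols=range(0, 8, 2),
             scales=(1, 2))
print(len(data))  # 4 * 4 * 2 = 32 defective images from one patch
```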
- a welded sample containing a defective local shape as shown in FIG. 4 is prepared.
- This welded sample is a defective product in which the protrusion C1 protrudes from the surface of the bead B1.
- the creator or provider of the weld sample possesses information about the location and extent of the protrusion C1 present in this weld sample.
- this weld sample is photographed and image information is acquired. Based on this image information and the information regarding the position and range of the protrusion C1, the image information regarding the shape of the protrusion C1 is extracted.
- Image data of defective welding samples having new defective local shapes is newly generated by expansion.
- A wide variety of defective product data 110 is generated from one teacher object 120. For example, it is possible to generate defective product data 110 of welding samples in which the size, position, angle, and number of protrusions C1 are changed.
- non-defective product data 190 including image data as shown in FIG. 5A exists.
- This image data is data of a so-called non-defective welded sample having no defective local shape.
- the data extraction unit 130 described above has extracted information regarding the shape of the protrusion C1 on the surface of the teacher object of the bead B1.
- By performing the data expansion process of adding the protrusion C1 to the non-defective bead B1 of FIG. 5A, it is possible to generate the image data shown in FIG. 5B.
- the position, size, angle, and the like of the protrusion C1 can be freely changed as described in the above-mentioned description regarding the parameters of the data expansion process.
- the image data shown in FIG. 5B is stored in the data expansion unit 140 as defective product data 110.
- By creating the teacher object 120 from a plurality of materials, it is possible to generate defective product data 110 of an intermediate material.
- the teacher object 120 is made of low carbon steel and high carbon steel, it is possible to generate defective product data 110 made of carbon steel having an intermediate carbon content. Thereby, various types of defective product data 110 can be efficiently generated from one teacher object 120.
- The defective local shape data 180 obtained from the teacher object 120 includes not only the protrusion C1 but also the pit D1 and the hole E1 shown in FIG. 5C. It is also possible to select one or more of the protrusion C1, the pit D1, and the hole E1 and perform data expansion processing on the non-defective product data 190 shown in FIG. 5A to obtain, for example, the image data shown in FIG. 5C, which can likewise be stored in the data expansion unit 140 as defective product data 110.
- A block diagram showing the generation system 200 of the non-defective product data 210 according to the second embodiment of the present disclosure is shown in FIG. 6.
- In the second embodiment, it is possible to generate non-defective product data 210 that can further improve the recognition rate of the class by combining data on the local shapes of a non-defective product with non-defective product data.
- the first data corresponds to the non-defective local shape data 280
- the second data corresponds to the non-defective data 290.
- The position information data 270 of the defective local shape is data regarding the position and range of the defective local shape of the teacher object 220. Therefore, image information not extracted by the data extraction unit 230 can be regarded as not being information about the defective local shape. In other words, in the process of extracting information on the defective local shape, information on shapes other than the defective local shape can also be acquired.
- the shape other than the defective local shape is a good local shape.
- the non-defective local shape data 280 is information regarding the shapes of dents and protrusions that are not recognized as defective and that exist in areas other than the position and range of the defective local shape.
- When the teacher object 220 is imaged by the photographing unit 250, features other than the defective local shape of the teacher object 220 are collected from the teacher object data 260, and data expansion processing can be performed on the non-defective product data 290. Since non-defective products entirely free of such local shapes are rare in the actual welding process, the non-defective product data 210 generated in this embodiment has the effect of improving the determination accuracy of the learning device for non-defective products.
- the parameters of the data expansion process used in the generation of the defective product data 110 and the generation of the non-defective product data 210 are common, and it is possible to generate each data at the same time. That is, the learning data regarding defective products and non-defective products is efficiently generated from one teacher object.
- the defective product data generation system 100 and the data generation method according to the third embodiment of the present disclosure will be described.
- The defective product data generation system 100 according to the present embodiment is the same as the generation system 100 of the defective product data 110 according to the first embodiment shown in the block diagram of FIG. 3. In the following, an example of a method of generating defective product data 110 is mainly described.
- FIG. 7A shows an image (non-defective product data 190) in the vicinity of the bead B1 formed by welding the first plate B11 and the second plate B12 in a good product welding sample. Further, an image in which a part of the region R1 of the bead B1 is partially enlarged is shown in FIG. 7B.
- the first plate B11 and the second plate B12 are made of, for example, metal.
- the bead B1 is made of, for example, a metal.
- FIG. 8A shows an image (teacher object data 160) of the vicinity of the bead B1 (teacher object 120) when the first plate B11 and the second plate B12 are welded in a defective welding sample having a recess D2. Further, a partially enlarged image of the recess D2 in the bead B1 (defective local shape data 180) is shown in FIG. 8B. A virtual image F2 also appears in FIG. 8B.
- FIG. 9A shows an image showing the actual shape of the recess D2 of the bead B1 in the defective weld sample.
- FIG. 9B shows a photograph showing a virtual image F2 of the recess D2 of the bead B1 in a defective welded sample.
- The actual defect of the second plate B12 is the recess D2 shown in FIG. 9A; however, depending on the arrangement of the camera (photographing unit 150) included in the data extraction unit 130 of the generation system 100 and of the light source inside the system, the virtual image F2 shown in FIG. 9B may be photographed instead of the actual appearance of the recess D2.
- When the photographing unit 150 irradiates the object with a laser and acquires the shape from the image formed by the irradiation or from the time until the laser returns, the laser light repeatedly reflects or diffuses inside the recess, and a virtual image is thereby created.
- not only the concave portion D2 but also the virtual image F2 may be photographed due to a defect contained in the bead B1.
- This embodiment is an embodiment relating to the generation of defective product data 110 when the virtual image F2 is photographed.
- FIG. 10 is a diagram showing a composite image of the enlarged image of the weld related to the non-defective product in the vicinity of the bead B1 and the enlarged image of the weld related to the defective product in the vicinity of the bead B1.
- By superimposing the virtual image on this image data (non-defective product data 190), image data indicating a defective product can be obtained.
- the data expansion unit 140 takes in this image data as defective product data 110 and learns it.
- FIG. 11 shows an image of the vicinity of the bead B1, which is the welded portion between the first plate B11 and the second plate B12, in another defective welded sample.
- various defects are present in the region surrounded by the square including the region R1A, the region R1B, and the region R1C.
- An enlarged image of region R1A is shown in FIG. 12A.
- An enlarged image of region R1B is shown in FIG. 12B.
- an enlarged image of the region R1C is shown in FIG. 12C.
- 12A to 12C are diagrams showing an image of a virtual image F2 of a defect possessed by the bead B1.
- images of the virtual image F2 in another region are shown in FIGS. 13A, 13B, 13C and 13D.
- Images of the virtual image F2 in yet another region are shown in FIGS. 14A, 14B, 14C and 14D.
- By superimposing an image having the virtual image F2 on the non-defective product data, image data showing a defective product can be obtained.
- the data expansion unit 140 takes in this image data as defective product data 110 and learns it.
- The images in FIGS. 12A to 12C, 13A to 13D, and 14A to 14D are images cut out by identifying a plurality of regions from the image of one bead B1 shown in FIG. 11 taken by the photographing unit 150.
- the shooting conditions of the shooting unit 150 may be variously changed to set a plurality of shooting conditions, and the bead B1 may be shot under different shooting conditions to obtain a plurality of images.
- a plurality of positions of the photographing unit 150 may be set, and a predetermined area of the bead B1 may be photographed from each of the plurality of positions to obtain an image.
- the images thus obtained are similar to those shown in FIGS. 12A-14D.
- FIG. 15A is an enlarged view of a case in which the first plate B11 and the second plates B12a and B12b, made of two different materials (specifically, different metals), are welded with a metal bead B1a and a bead B1b of a metal different from the bead B1a.
- FIG. 15B is a diagram showing an image of a virtual image F2A corresponding to the recess D2A in the region R1D of the bead B1a.
- FIG. 15C is a diagram showing an image of a virtual image F2B corresponding to the recess D2B in the region R1E of the bead B1a.
- The above is an example of a method of fusing the virtual image F2A shown in FIG. 15B and the virtual image F2B shown in FIG. 15C; an intermediate image between the virtual image F2A and the virtual image F2B may also be generated by a method such as morphing.
- image data showing a plurality of defective products can be obtained.
- the data expansion unit 140 captures and learns the image data indicating the plurality of defective products as defective product data 110.
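One simple way to produce intermediate images between two virtual-image patches, as a stand-in for the morphing mentioned above, is a linear cross-fade (illustrative only; true morphing would also warp the geometry, not just blend intensities):

```python
import numpy as np

def intermediates(patch_a: np.ndarray, patch_b: np.ndarray, steps: int):
    """Linear cross-fades between two virtual-image patches --
    a simple stand-in for the morphing mentioned in the text."""
    assert patch_a.shape == patch_b.shape
    return [(1 - t) * patch_a + t * patch_b
            for t in np.linspace(0.0, 1.0, steps)]

f2a = np.zeros((4, 4))   # stand-in for virtual image F2A
f2b = np.ones((4, 4))    # stand-in for virtual image F2B
mids = intermediates(f2a, f2b, steps=5)
print([float(m[0, 0]) for m in mids])  # [0.0, 0.25, 0.5, 0.75, 1.0]
```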
- FIG. 17 is an explanatory diagram of complementary processing in which the vicinity of the virtual image in a photograph of the welded portion is cut out in a rectangular shape and divided at equal intervals into M rows and N columns of small square images.
- FIG. 18 is an explanatory diagram showing a process of extracting only the virtual image F2 from the real image.
- The rectangular image is divided at equal intervals into M small square images in the vertical direction and N in the horizontal direction (hereinafter referred to as pixels). Note that M and N are integers.
- The upper-left pixel in FIG. 17 is labeled (0, 0), and the N pixels arranged in the horizontal direction are labeled (1, 0), ..., (n, 0), ..., (N-1, 0), where n is an integer.
- In the leftmost column, the second pixel from the top is labeled (0, 1), and the pixels arranged in its horizontal direction are labeled (1, 1), ..., (N-1, 1).
- In this way, the M × N pixels arranged in a matrix are labeled.
- The pixels in the bottom row are labeled (0, M-1), (1, M-1), ..., (n, M-1), ..., (N-1, M-1).
- Hereinafter, the pixels arranged in an M × N matrix are referred to as a pixel matrix.
- the brightness of the pixels in the top row, bottom row, leftmost column, and rightmost column is obtained from the brightness of the actual image included in the pixel block.
- a real image is an actual image taken by a camera.
- The brightness is defined, for example, as 1 when the image is white and 0 when the image is black; the target image is compared with images obtained at various mixing ratios of white and black, and the mixing ratio at which they match is taken as the brightness value.
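The division into an M × N pixel matrix and the reading of border-cell brightness from the real image can be sketched as follows. This assumes a grayscale image normalized so that 0 = black and 1 = white; the function name and the mean-based brightness per cell are assumptions:

```python
import numpy as np

def border_brightness(real: np.ndarray, M: int, N: int):
    """Divide the cut-out into an M x N pixel matrix labeled (n, m)
    and take the mean real-image brightness (0=black, 1=white) of
    every cell in the top row, bottom row, leftmost column, and
    rightmost column, as in the complement process described above."""
    H, W = real.shape
    bh, bw = H // M, W // N                    # cell height and width
    border = {}
    for m in range(M):
        for n in range(N):
            if m in (0, M - 1) or n in (0, N - 1):   # border cells only
                block = real[m * bh:(m + 1) * bh, n * bw:(n + 1) * bw]
                border[(n, m)] = float(block.mean())
    return border

real = np.ones((8, 8)) * 0.5          # stand-in for a normalized real image
b = border_brightness(real, M=4, N=4)
print(len(b), b[(0, 0)])  # 12 border cells, each with brightness 0.5
```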
- FIG. 19A shows a photograph of the vicinity of the bead B1 when the first plate B11 and the second plate B12 are welded in a welding sample of a defective product.
- Of FIG. 19A, a photograph of the recess D2 in the region R1F of the bead B1 is shown in FIG. 19B. Further, the virtual image F2 obtained in (4-3) is shown in FIG. 19C.
- the virtual image F2 in FIG. 19C is resized to various sizes and superimposed on the photograph of the recess D2 shown in FIG. 19B. In this way, the images including virtual images F2 of various sizes shown in FIGS. 20A to 20C can be obtained.
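The resize-and-superimpose augmentation just described can be sketched as follows. This is a hedged illustration using nearest-neighbor scaling on nested-list grayscale images; the function names, sizes, and pixel values are illustrative assumptions, not taken from the disclosure.

```python
# Sketch: resize the virtual defect image F2 to several sizes and paste each
# onto a photograph of the recess D2 to obtain augmented defective images.
# Grayscale values are floats in [0, 1]; 1 = white, 0 = black (see above).

def resize_nearest(img, new_h, new_w):
    """Nearest-neighbor resize of a nested-list grayscale image."""
    h, w = len(img), len(img[0])
    return [[img[r * h // new_h][c * w // new_w] for c in range(new_w)]
            for r in range(new_h)]

def superimpose(base, patch, top, left):
    """Return a copy of `base` with `patch` written in at (top, left)."""
    out = [row[:] for row in base]
    for r, prow in enumerate(patch):
        for c, v in enumerate(prow):
            out[top + r][left + c] = v
    return out

photo_d2 = [[1.0] * 8 for _ in range(8)]  # bright region around recess D2
f2 = [[0.0, 0.0], [0.0, 0.0]]             # dark 2x2 virtual image F2
augmented = [superimpose(photo_d2, resize_nearest(f2, s, s), 1, 1)
             for s in (2, 3, 4)]          # F2 at various sizes
```

Each element of `augmented` corresponds to one of the images of FIGS. 20A to 20C: the same background with the virtual defect at a different scale.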
- the data expansion unit 140 takes in and learns the welding image data indicating a defective product obtained in (4-4) as defective product data 110.
- FIG. 21 shows a block diagram of the additional learning necessity device 300 according to the fourth embodiment and the welding device 301 provided with the device 300.
- the welding device 301 is controlled by the controller 302.
- the additional learning necessity device 300 has a built-in defective product data generation system 100 shown in FIG. 3, and transmits data such as defective product data 110 to the controller 302.
- the generation system 100 for the defective product data 110 shown in FIG. 3 is described in the first embodiment.
- the controller 302 receives data such as defective product data 110, and controls the welding device 301 based on the data.
- the additional learning necessity device 300 includes a data generation device 101.
- the data generation device 101 has a built-in generation system 100 shown in FIG. 3, and has a data extraction unit 130 and a data expansion unit 140. Further, the data generation device 101 includes a central processing unit (CPU), a memory, and a communication device.
- the data extraction unit 130 is an arithmetic unit that extracts desired data, and has a photographing unit 150 and a data generation unit 161.
- the photographing unit 150 is, for example, a camera provided with an image sensor, and images the teacher object 120.
- Examples of the image pickup device include a charge-coupled device (CCD) and a CMOS sensor.
- FIG. 22 shows a block diagram of the data generation unit 161.
- the data generation unit 161 has, for example, an arithmetic unit 163 equipped with a CPU and a memory 162 such as a hard disk drive or SSD; the data of the teacher object 120 imaged by the photographing unit 150 (teacher object data 160) is stored in the memory 162. Further, the data generation unit 161 acquires the position information data 170 of the defective local shape from the first data storage unit 171.
- the first data storage unit 171 has a memory such as a hard disk drive or an SSD, and stores the position information data 170 of the defective local shape.
- the data generation unit 161 combines the teacher object data 160 stored in the memory 162 with the position information data 170 of the defective local shape, generates the defective local shape data 180, and transmits the defective local shape data 180 to the data expansion unit 140.
- the data expansion unit 140 is an arithmetic unit that generates defective product data 110, and has a second data storage unit 181, a third data storage unit 191 and a data combination device 111.
- the second data storage unit 181 has a memory such as a hard disk drive or SSD, and stores the defective local shape data 180 received from the data extraction unit 130.
- the third data storage unit 191 has a memory such as a hard disk drive or SSD, and stores non-defective data 190.
- FIG. 23 shows a block diagram of the data combination device 111.
- the data combination device 111 has, for example, a calculation device 112 equipped with a CPU and a data expansion memory 113 such as a hard disk drive or SSD, and generates the defective product data 110 by combining the defective local shape data 180 with the non-defective data 190. The data combination device 111 then stores the generated defective product data 110 in the data expansion memory 113, learns it as learning data, and transmits the defective product data 110 to the controller 302.
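The combination performed by the data combination device can be sketched as pasting a defective local shape patch into non-defective images to synthesize labeled defective product data. All names, region formats, and the label string below are illustrative assumptions, not the patent's actual data structures.

```python
# Hedged sketch of the data combination device: defective local shape data
# (a defect patch) is combined with non-defective images to synthesize
# labeled defective product data for training.

def combine(defect_patch, good_images, positions):
    """Paste `defect_patch` into each good image at the paired position and
    return (image, label) training pairs."""
    training = []
    for img, (top, left) in zip(good_images, positions):
        out = [row[:] for row in img]
        for r, prow in enumerate(defect_patch):
            for c, v in enumerate(prow):
                out[top + r][left + c] = v
        training.append((out, "defective"))
    return training

good_images = [[[1.0] * 6 for _ in range(6)] for _ in range(2)]  # non-defective data
patch = [[0.2, 0.2], [0.2, 0.2]]                                 # defective local shape
defective_data = combine(patch, good_images, [(0, 0), (2, 3)])
```

Varying the paste position across many good images is what expands a small number of defect samples into a larger training set.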
- the controller 302 has a central processing unit, a memory, and a communication device.
- when the controller 302 receives the defective product data 110 from the data combination device 111, it controls the welding device 301 to obtain data such as new non-defective product data 190 and position information data 170 of the defective local shape, and transmits these data to the first data storage unit 171 and the third data storage unit 191 again.
- the data generation device 101 acquires data related to welding and newly generates defective product data 110.
- the controller 302 controls the welding device 301 based on these data, and causes the welding device 301 to perform welding.
- the additional learning necessity device 300 compares the data of the bead B1 obtained as a result of welding with the defective product data 110, determines whether the welding is good or bad, and determines whether or not to perform re-learning.
- in this way, the additional learning necessity device 300 determines whether the welded product is good or bad using the defective product data 110, and determines whether or not to relearn.
- the additional learning necessity device 300 obtains the data of the bead B1 which is the result of welding by the welding device 301 (step S1). Further, the additional learning necessity device 300 obtains defective product data 110 by using the data generation device 101 (step S2). Here, in step S2, the data generation device 101 generates defective product data 110 according to the AI model constructed in advance.
- the "pre-built AI model" may be an AI model provided in the additional learning necessity device 300 in advance, or may be, for example, an AI model trained immediately beforehand as in the third embodiment.
- the "AI model” is a model that generates defective product data 110, compares the bead B1 data described below with the obtained defective product data 110, and infers good or bad.
- the defective product data 110 is provided in the additional learning necessity device 300 in advance.
- next, the additional learning necessity device 300 compares the data of the bead B1 with the defective product data 110 obtained according to the pre-built AI model, and infers whether the weld is good or bad (step S3).
- the additional learning necessity device 300 determines whether the inference of good / bad obtained in step S3 is good (Yes) or bad (No) (step S4). Based on the result of good (Yes) or bad (No), it is determined whether or not to further store and learn the defective product data 110.
- if the result in step S4 is good (Yes), that is, if good products and defective products can be accurately distinguished, the additional learning necessity device 300 instructs the data generation device 101 to stop generating the defective product data 110 (learning cancellation, step S5).
- if the result in step S4 is bad (No), the additional learning necessity device 300 instructs the data generation device 101 to learn (step S6), that is, to newly generate defective product data 110 (step S7). The additional learning necessity device 300 then updates the AI model or constructs a new AI model (step S8, "new AI model").
- in step S4, a small amount of data may be created by data expansion using the method of the present embodiment, and a "bad" determination may be made when the accuracy of inference on the created data is lower than a predetermined value.
- the data may be expanded to generate a larger amount of teacher data, and additional learning (re-learning) may be performed (steps S6 to S8).
- the predetermined value is, for example, the good-product rate of the beads B1 obtained by welding, that is, the number of good beads B1 divided by the total number of beads B1, expressed as a percentage.
- here, the "total number of beads B1" is the number of beads B1 actually welded, and the "number of good beads B1" is the number of actually welded beads B1 that are judged good by visual inspection. In this way, after learning, it is possible to determine whether a weld is good or bad with the same accuracy as visual confirmation, without actually confirming the weld visually.
- the additional learning necessity device 300 can be said to be a learning data generation system when viewed as a system.
- the first means is as follows. The bead B1 is photographed using the photographing unit 150 to obtain an image, that is, teacher object data 160.
- the data generation unit 161 collates the teacher object data 160 with the position information data 170 of a plurality of defective local shapes acquired from the first data storage unit 171, and extracts a plurality of defective local data from the teacher object data 160. The data generation unit 161 then fuses these defective local data to obtain the defective local shape data 180.
- FIG. 25 is a block diagram of means for obtaining defective local shape data 180.
- the bead B1 is photographed by the photographing unit 150, and the teacher object data 160 is obtained.
- the obtained teacher object data 160 is, for example, an image shown in FIG.
- the data generation unit 161 collates the teacher object data 160 with the position information data 170 of a plurality of defective local shapes, and obtains defective local data αi (i is an integer from 1 to n).
- the defective local data αi is, for example, an image shown in any of FIGS. 12A to 12C, 13A to 13D, and 14A to 14D.
- the data generation unit 161 fuses the defective local data α1, α2, ..., αn to obtain the defective local shape data 180.
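The first means can be sketched as cropping one defective local patch per stored position entry and collecting the patches into one dataset. The region format `(top, left, height, width)` and the fusion rule (a simple keyed collection) are assumptions for illustration; the disclosure does not fix them.

```python
# Minimal sketch of the first means: crop a defective local patch alpha_i
# from the teacher image for each position-information entry, then fuse the
# patches into defective local shape data 180.

def crop(image, top, left, height, width):
    """Cut a height x width patch out of a nested-list grayscale image."""
    return [row[left:left + width] for row in image[top:top + height]]

def extract_defective_local_data(teacher_image, position_info):
    """position_info: list of (top, left, height, width) defect regions."""
    return [crop(teacher_image, *region) for region in position_info]

def fuse(patches):
    """Collect alpha_1..alpha_n into one defective-local-shape dataset
    (a simple keyed collection here; the actual fusion may differ)."""
    return {f"alpha_{i + 1}": p for i, p in enumerate(patches)}

teacher = [[(r * 10 + c) / 100 for c in range(10)] for r in range(10)]
alphas = extract_defective_local_data(teacher, [(0, 0, 2, 2), (4, 4, 3, 3)])
data_180 = fuse(alphas)
```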
- the second means is as follows. A plurality of shooting conditions are set by variously changing the shooting conditions of the photographing unit 150 (for example, the position of the photographing unit 150 or the emission intensity of the light source used for shooting), and the bead B1 is photographed under the different shooting conditions.
- this yields a plurality of images, that is, a plurality of teacher object data 160.
- the teacher object data 160 may be an image of a part of the area of the teacher object 120.
- the data generation unit 161 collates the plurality of teacher object data 160 with the position information data 170 of the defective local shape acquired from the first data storage unit 171, and extracts a plurality of defective local data from the plurality of teacher object data 160. The data generation unit 161 then fuses these defective local data to obtain the defective local shape data 180.
- FIG. 26 is a block diagram of means for obtaining defective local shape data 180.
- n is a natural number.
- the shooting condition is, for example, a shooting area.
- let β1, β2, ..., βn be the plurality of teacher object data 160 captured under these n conditions, respectively.
- the data βi (i is any integer from 1 to n) is, for example, an image shown in any of FIGS. 12A to 12C, 13A to 13D, and 14A to 14D.
- the data generation unit 161 collates each data βi (i is an integer from 1 to n) with the position information data 170 of the defective local shape. Then, the data generation unit 161 fuses the data β1, β2, ..., βn to obtain the defective local shape data 180.
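The second means can be sketched as collating the same defect region across images captured under the n shooting conditions and fusing them. Per-pixel averaging is used below purely as an illustrative fusion rule; the disclosure does not specify how the βi are fused.

```python
# Sketch of the second means: the same bead region is captured under n
# shooting conditions (beta_1..beta_n), the defect region is collated in
# each image, and the aligned patches are fused (here: averaged).

def crop(image, top, left, height, width):
    return [row[left:left + width] for row in image[top:top + height]]

def fuse_conditions(images, region):
    """Average the defect region given by (top, left, height, width)
    across the images beta_1..beta_n."""
    patches = [crop(img, *region) for img in images]
    n, h, w = len(patches), len(patches[0]), len(patches[0][0])
    return [[sum(p[r][c] for p in patches) / n for c in range(w)]
            for r in range(h)]

betas = [[[0.2] * 4 for _ in range(4)],   # e.g. low light-source intensity
         [[0.6] * 4 for _ in range(4)]]   # e.g. high light-source intensity
data_180 = fuse_conditions(betas, (1, 1, 2, 2))
```

Fusing across conditions makes the resulting defective local shape data less dependent on any single camera position or lighting setup.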
- even when the AI model has been trained in advance, the learning data generation system is especially useful when the shape photographed in the customer environment differs significantly from the data used for that prior training.
- FIG. 27 shows a block diagram of the additional learning necessity device 400 according to the fifth embodiment and the welding device 401 provided with the device 400.
- the welding device 401 is controlled by the controller 402.
- the additional learning necessity device 400 has a built-in non-defective product data generation system 200 shown in FIG. 6, and transmits data such as non-defective product data 210 to the controller 402.
- the generation system 200 for the non-defective product data 210 shown in FIG. 6 is described in the second embodiment.
- the controller 402 receives data such as non-defective product data 210 and controls the welding device 401 based on the data.
- the additional learning necessity device 400 includes a data generation device 201.
- the data generation device 201 has a built-in generation system 200 shown in FIG. 6, and has a data extraction unit 230 and a data expansion unit 240. Further, the data generation device 201 includes a CPU, a memory, and a communication device.
- the data extraction unit 230 is an arithmetic unit that extracts desired data, and has a photographing unit 250 and a data generation unit 261.
- the photographing unit 250 is, for example, a camera provided with an image sensor, and images the teacher object 220.
- Examples of the image pickup device include a charge-coupled device (CCD) and a CMOS sensor.
- FIG. 28 shows a block diagram of the data generation unit 261.
- the data generation unit 261 has, for example, an arithmetic unit 263 equipped with a CPU and a memory 262 such as a hard disk drive or SSD.
- the memory 262 stores the data (teacher object data 260) of the teacher object 220 imaged by the photographing unit 250. Further, the data generation unit 261 acquires the position information data 270 of the defective local shape from the first data storage unit 271.
- the first data storage unit 271 has, for example, a memory such as a hard disk drive or SSD, and stores position information data 270 of a defective local shape.
- the data generation unit 261 combines the teacher object data 260 stored in the memory 262 with the position information data 270 of the defective local shape, generates the non-defective local shape data 280, and transmits it to the data expansion unit 240.
- the data expansion unit 240 is an arithmetic unit that generates non-defective data 210, and has a second data storage unit 281, a third data storage unit 291 and a data combination device 211.
- the second data storage unit 281 has a memory such as a hard disk drive or SSD, and stores non-defective local shape data 280 received from the data extraction unit 230.
- the third data storage unit 291 has a memory such as a hard disk drive or SSD, and stores non-defective data 290.
- FIG. 29 shows a block diagram of the data combination device 211.
- the data combination device 211 has, for example, a calculation device 212 equipped with a CPU and a data expansion memory 213 such as a hard disk drive or SSD, and generates the non-defective product data 210 by combining the non-defective local shape data 280 with the non-defective data 290. The data combination device 211 then stores the generated non-defective product data 210 in the data expansion memory 213, learns it as learning data, and transmits the non-defective product data 210 to the controller 402.
- the controller 402 has a central processing unit, a memory, and a communication device.
- when the controller 402 receives the non-defective product data 210 from the data combination device 211, it controls the welding device 401 to obtain data such as new non-defective product data 290 and position information data 270 of the defective local shape, and transmits these data to the first data storage unit 271 and the third data storage unit 291 again.
- the data generation device 201 acquires data related to welding and newly generates non-defective product data 210.
- the additional learning necessity device 400 causes the welding device 401 to perform welding based on these data.
- the additional learning necessity device 400 compares the bead B1 obtained as a result of welding with the non-defective product data 210, determines whether the welding is good or bad, and determines whether or not to relearn.
- the additional learning necessity device 400 uses the non-defective product data 210 to determine whether the welded product is good or bad and whether or not to relearn, as in the fourth embodiment. That is, if the result of the good/bad determination is good, that is, if good products and defective products can be accurately distinguished, the additional learning necessity device 400 instructs the data generation device 201 to stop generating the non-defective product data 210. If the result of the good/bad determination is bad, the additional learning necessity device 400 instructs the data generation device 201 to newly generate non-defective product data 210.
- the additional learning necessity device 400 can be said to be a learning data generation system when viewed as a system.
- the method of generating the defective local shape data 180 is the same as that of the fourth embodiment.
- FIG. 30 shows an outline of the additional learning necessity device 300 and the welding device 301 equipped with the device 300.
- the welding device 301 incorporates an additional learning necessity device 300.
- the additional learning necessity device 300 has a built-in controller 302. That is, the welding device 301 has a built-in additional learning necessity device 300 and a controller 302, which is different from the welding device 301 according to the fifth embodiment.
- the functions of the additional learning necessity device 300 and the controller 302 are the same as those of the additional learning necessity device 300 and the controller 302 shown in the fifth embodiment.
- FIG. 31 shows an outline of the additional learning necessity device 300 and the molding device 501 equipped with the device 300.
- the molding device 501, for example, injection-molds a plastic product; that is, the molding apparatus 501 manufactures a product from a plastic raw material, whereas the welding apparatus 301 welds using metal. In this way, the additional learning necessity device 300 can also be applied to manufacturing other than welding, for example to the injection molding of plastic products.
- the position information data 170 of the defective local shape and the non-defective product data 190 can be obtained from the manufactured plastic product.
- in the above, the additional learning necessity device 300 is applied to welding and to the injection molding of plastic products, but the additional learning necessity device 300 can also be applied elsewhere, for example to the painting and bonding of products made of various materials.
- likewise, the additional learning necessity device 400 is applied to welding above, but the additional learning necessity device 400 can also be applied elsewhere, for example to the injection molding of plastic products and to the painting and bonding of products made of various materials.
- data can be expanded by this method (superimposition), and additional learning (re-learning) can be performed.
- further, a small amount of data may be created by the data expansion according to the present embodiment, and when the accuracy of inference on the created data is lower than a predetermined value, data expansion that generates a larger amount of teacher data can be performed for additional learning (re-learning).
- the data generation system (100, 200) according to the first embodiment includes a data extraction unit (130) and a data expansion unit (140).
- the data extraction unit (130) extracts the first data (180, 280) regarding the defective local shape from the data (160, 260) of the object (120, 220) having the defective local shape. Specifically, the data extraction unit (130) extracts the first data (180, 280) based on the data (160, 260) of the object (120, 220) having the defective local shape and the position information data (170, 270) of the defective local shape.
- the data extension unit (140, 240) combines the first data (180, 280) with the second data (190, 290) of an object having no defective local shape.
- in the data generation system (100, 200) according to the second embodiment, the object (120, 220) in the data generation system (100, 200) according to the first embodiment has a plurality of defective local shapes.
- in the data generation system (100, 200) according to the third embodiment, the object (120, 220) in the data generation system (100, 200) according to the first or second embodiment is formed of a plurality of materials.
- in the data generation system (100, 200) according to the fourth embodiment, the data extraction unit (130) includes a photographing unit (150, 250) that photographs the object (120, 220).
- in the data generation system (100, 200) according to the fifth embodiment, the photographing unit (150, 250) photographs the object (120, 220) under a plurality of shooting conditions.
- in the data generation system (100, 200) according to the sixth embodiment, the first data (180, 280) and the second data (190, 290) in the data generation system (100, 200) according to any of the first to fifth embodiments are image information.
- in the data generation system (100, 200) according to the seventh embodiment, the combination of the first data (180, 280) and the second data (190, 290) in the data generation system (100, 200) according to any of the first to sixth embodiments is performed by data expansion based on a plurality of parameters.
- the data generation method according to the eighth aspect includes a step of extracting the first data (180, 280) regarding the defective local shape from the data (160, 260) of the object (120, 220) having the defective local shape, and a step of combining the first data (180, 280) with the second data (190, 290) of an object having no defective local shape.
- the extraction step includes a step of extracting the first data (180, 280) based on the data (160, 260) of the object (120, 220) having the defective local shape and the position information data (170, 270) of the defective local shape.
- the object (120, 220) has a plurality of defective local shapes.
- the object (120, 220) is formed of a plurality of materials.
- the data generation method according to the eleventh aspect is the data generation method according to any of the eighth to tenth aspects, further provided with a photographing unit (150, 250) that photographs the object (120, 220).
- the photographing unit (150, 250) photographs the object (120, 220) under a plurality of photographing conditions.
- the data generation method according to the thirteenth aspect is the data generation method according to any of the eighth to twelfth aspects, in which the first data (180, 280) and the second data (190, 290) are image information.
- the data generation method according to the fourteenth aspect is the data generation method according to any of the eighth to thirteenth aspects, wherein the combination of the first data (180, 280) and the second data (190, 290) is performed by data expansion based on a plurality of parameters.
- the data generation method is a method of generating learning data (110, 210) for determining a class assigned to a local feature of an object (120, 220).
- the method extracts the local feature (180, 280) from data of a sample object (120, 220) having the class of local feature, based on position information data (170, 270) indicating the position and range of the region of the object (120, 220) that includes the local feature, and generates the learning data (110, 210) using the obtained local feature (180, 280).
- the data generation device (101) according to the sixteenth embodiment includes an extraction unit (130) and a data expansion unit (140).
- the extraction unit (130) extracts the first data (180) from the object (120) having a defective local shape.
- the extraction unit (130) includes a generation unit (161).
- the generation unit (161) extracts a plurality of data regarding the defective local shape from the object (120).
- the generation unit (161) fuses the plurality of data to generate the first data (180).
- the data expansion unit (140) includes a combination unit (111).
- the combination unit (111) combines the first data (180) and the second data (190) of an object having no defective local shape.
- the data generation device (101) according to the seventeenth embodiment includes an extraction unit (130) and a data expansion unit (140).
- the extraction unit (130) extracts the first data (180) from the object (120) having a defective local shape.
- the extraction unit (130) includes a generation unit (161).
- the generation unit (161) generates the first data (180) by fusing the data (160) taken under a plurality of shooting conditions.
- the data expansion unit (140) includes a combination unit (111).
- the combination unit (111) combines the first data (180) and the second data (190) of an object having no defective local shape.
- the additional learning necessity device according to the eighteenth embodiment performs inference on the data (110) generated by the data generation device (101) according to the sixteenth or seventeenth embodiment using a pre-built AI model, and determines that additional learning is to be performed when a correct inference result is not obtained.
- the additional learning necessity device creates a small amount of data in the additional learning necessity device according to the eighteenth embodiment, and performs additional learning by carrying out data expansion that generates a larger amount of teacher data when the accuracy of inference on the created data is lower than a predetermined value.
- the additional learning necessity device includes the data generation device (101) according to the sixteenth or seventeenth embodiment, and has a step (S1) of obtaining the data (B1) of an object, a step (S2) of obtaining the first data (180), a step (S3) of comparing the data (B1) of the object with the first data (180) and inferring good or bad, a step (S4) of determining whether the inference is good or bad and, based on the result, whether or not to further store and learn the first data (180), a step (S5) of stopping the additional learning if the result is good, steps (S6, S7) of instructing the data generation device (101) to newly generate the first data if the result is bad, and a step (S8) of updating the AI model or constructing a new AI model.
- the data generation system, data generation method, data generation device, and additional learning necessity device of the present disclosure can be applied to, for example, a welding appearance inspection for inspecting whether or not welding is performed correctly, and are industrially useful.
Abstract
Provided is a method for creating learning data that can contribute to improving the recognition rate of a class assigned to an object. The present invention is provided with: a data extraction unit (130) that extracts first data relating to a bad local shape from data about the object having the bad local shape; and a data expansion unit (140) that combines the first data with second data about an object not having a bad local shape. The present invention is a data generation system (100) in which the data extraction unit (130) extracts the first data on the basis of the data about the object having the bad local shape, and position information data (170) about the bad local shape.
Description
The present disclosure generally relates to a data generation system for learning data, a data generation method, a data generation device, and an additional learning necessity device having a data generation device.
In Patent Document 1, design data having three-dimensional information or mesh data having three-dimensional information imitating parts is rendered from many directions to generate a large number of images, and a learning target area is recognized in the generated images. A method of adding a label of a recognized learning target area to each image, further performing extended processing to obtain training data, modeling using the generated training data, and generating a classifier is disclosed.
Various sensors are used to determine the class assigned to an object. However, there is a problem that the judgment accuracy of the learner is lowered due to the difference in the sensor model, the noise removal process for the acquired sensor value, and the inspection environment. Further, in order to improve the determination accuracy of the learning device, learning data is required, but it is difficult to collect defective data for creating the learning data. Further, even if the AI (artificial intelligence) model is learned in advance, if the shape photographed in the customer environment is significantly different from the data used in advance, additional learning is required.
In order to solve the above problem, the data generation system of the present disclosure includes a data extraction unit and a data extension unit. The data extraction unit extracts the first data regarding the defective local shape from the data of the object having the defective local shape. The data extension unit combines the first data with the second data of an object having no defective local shape. The data extraction unit extracts the first data based on the data of the object having the defective local shape and the position information data of the defective local shape.
The learning data of the present disclosure has an advantage of improving the determination accuracy of the learning device that determines the class assigned to the object.
Hereinafter, the learning data generation system according to the present disclosure will be described. It should be noted that all the embodiments disclosed below are examples, and there is no intention of limiting the learning data generation system according to the present disclosure.
Further, in the embodiment disclosed below, more detailed explanation than necessary may be omitted. For example, a detailed description of a well-known matter or a duplicate description of a substantially identical configuration may be omitted. This is to facilitate the understanding of those skilled in the art by avoiding unnecessarily redundant explanations.
(First Embodiment)
(1) Outline
The learning data according to the first embodiment of the present disclosure is the data that serves as a basis for determining the class assigned to an object. The "class" here is a classification such as non-defective product and defective product, and there may be three or more classes, such as non-defective product, adaptive product, and defective product.
In the present embodiment, the target for determining the class using the learning data is, for example, as shown in FIG. 1, when two or more members (here, the first plate B11 and the second plate B12) are welded, the welded portion. The bead B1 formed in and the surrounding area. That is, FIG. 1 is a diagram showing an outline of the vicinity of the welded portion to which the learning data is applied. Then, when the image data including the bead B1 is input, the learner using the learning data estimates the state of the bead B1 and outputs the estimation result. Specifically, the learning device outputs, as an estimation result, whether the bead B1 is a non-defective product or a defective product, or if it is a defective product, the type of the defective product. That is, the learner is used for a weld appearance inspection to inspect whether the bead B1 is a non-defective product, in other words, whether or not welding is performed correctly.
Whether the bead B1 is non-defective is determined, for example, by whether numerical values such as the length of the bead B1, the height of the bead B1, the rising angle of the bead B1, the throat thickness of the bead B1, the excess weld metal of the bead B1, and the positional deviation of the weld (including deviation of the start end of the bead B1) fall within permissible ranges. For example, if even one of the conditions listed above is outside its permissible range, the bead B1 is determined to be defective.
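The pass/fail rule described above can be sketched as a tolerance check over the measured bead quantities. The following is a minimal illustration only; the field names and numeric ranges are assumptions, not values from the disclosure.

```python
# Permissible ranges for bead measurements (illustrative values, not from the patent).
TOLERANCES = {
    'length_mm': (40.0, 60.0),
    'height_mm': (1.0, 3.0),
    'throat_mm': (2.0, 4.0),
}

def judge_bead(measurements):
    """Return 'good' only if every measured value lies inside its permissible range."""
    for key, (lo, hi) in TOLERANCES.items():
        if not lo <= measurements[key] <= hi:
            return 'defective'   # a single out-of-range value suffices
    return 'good'

ok = judge_bead({'length_mm': 50.0, 'height_mm': 2.0, 'throat_mm': 3.0})
ng = judge_bead({'length_mm': 50.0, 'height_mm': 5.0, 'throat_mm': 3.0})
```

As in the text, one out-of-range condition is enough to mark the bead defective.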
In addition to the determination using the above numerical values, whether the bead B1 is non-defective is also determined, for example, based on the presence or absence of an undercut B2 of the bead B1 (see FIG. 2A), a pit B3 of the bead B1 (see FIG. 2B), spatter B4 of the bead B1 (see FIG. 2C), and protrusions of the bead B1. For example, if even one of the defective features listed above occurs, the bead B1 is determined to be defective.
Here, in order to generate a learner with a high recognition rate, it is necessary to prepare a large number of image data items, including non-defective product data and defective product data of the recognition target, as learning data (see FIG. 3). However, when defective products of the recognition target occur only infrequently, the learning data required to generate a learner with a high recognition rate tends to be insufficient.
Therefore, it is conceivable to experimentally generate learning data imitating a defective product by CG (Computer Graphics) processing or data expansion processing. For example, it is conceivable to increase the number of training data items and perform machine learning of a model by executing data expansion (Data Augmentation) processing on data obtained by actually photographing the bead B1 with an imaging device. The "data expansion processing" here refers to processing that artificially inflates training data by applying operations such as translation, scaling, rotation, flipping, or noise addition to given data.
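As a rough illustration of the data expansion operations listed above (translation, flipping, noise addition), the following NumPy sketch applies them to a 2-D grayscale image array. The function name and parameters are illustrative assumptions, not part of the disclosed system.

```python
import numpy as np

def augment(image, shift=(0, 0), flip=False, noise_std=0.0, seed=0):
    """Apply simple data-expansion operations to a 2-D grayscale image array."""
    rng = np.random.default_rng(seed)
    out = np.roll(image, shift, axis=(0, 1))        # translation (wrap-around)
    if flip:
        out = np.fliplr(out)                        # horizontal flip
    if noise_std > 0:
        out = out + rng.normal(0.0, noise_std, out.shape)  # additive noise
    return out

img = np.zeros((4, 4))
img[1, 1] = 1.0                        # a single bright pixel
shifted = augment(img, shift=(1, 2))   # pixel moves from (1, 1) to (2, 3)
```

Each operation alone yields a new, slightly different training sample from the same source image.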
However, the problem is that learning data obtained in this way differs greatly from the defective product data obtained in the actual inspection environment. Image data obtained from the sensors used in inspection contains noise and virtual images. Consequently, the accuracy with which a learner generated from such learning data classifies an image to be inspected varies depending on the sensor model used in the inspection and on the noise removal processing applied to the acquired image data. In other words, in order to improve the classification accuracy of the learner, it is necessary to generate learning data that takes differences in the inspection environment into account.
One embodiment of the present disclosure is a generation system 100 for defective product data that takes differences in the inspection environment into account. FIG. 3 is a block diagram showing the generation system 100 of the defective product data 110 according to the first embodiment. In the present embodiment, a teacher object 120 having a local defective shape (hereinafter referred to as a defective local shape) imitating the bead B1 is produced in an environment different from the inspection environment. In the inspection environment, the inspector photographs the teacher object 120 using the sensor used in the inspection. The creator of the teacher object 120 extracts defective local shape data 180 of the teacher object based on the data about the teacher object 120 obtained by the photographing and on information about the position and range of the defective local shape in the teacher object. By combining this defective local shape data 180 with non-defective product data 190 having no defective local shape, defective product data 110 that contributes to improving the classification accuracy of the learner is generated.
(2) Details
Hereinafter, the generation system 100 of the defective product data will be described in detail with reference to FIG. 3. The defective product data 110 is generated by a data extraction unit 130 and a data expansion unit 140.
The data extraction unit 130 includes a photographing unit 150 and generates the defective local shape data 180 based on teacher object data 160 obtained from the photographing unit 150 and on position information data 170 of the defective local shape of the teacher object 120.
The teacher object 120 is an object to be judged by the learner (in other words, an object to be recognized). The teacher object 120 includes, for example, the bead B1 formed at a weld. When the bead B1 is used as the teacher object, the material of the teacher object 120 is metal; however, in another embodiment, the teacher object 120 may be formed of, for example, resin, and the material is not limited. The teacher object 120 may be formed of a single material or of a plurality of different materials, for example, a composite of low-carbon steel and high-carbon steel.
The teacher object 120 has a defective local shape. FIGS. 2A to 2C are diagrams showing examples of cross sections of the bead B1 having defective local shapes. As shown in FIGS. 2A, 2B, and 2C, the defective local shape is, for example, an undercut B2, a pit B3, or spatter B4 generated during welding. However, the defective local shape is not limited to these shapes and may be, for example, a protrusion or burn-through. A plurality of defective local shapes may be present in one teacher object 120, and a plurality of types of defective local shapes may be included.
The defective local shape may be created by imitating the defective local shape of an actual object to be inspected, or may be created by design. For example, when the object to be inspected is produced by welding, performing welding under conditions in which defects are likely to occur makes it possible to create, in the teacher object 120, a defective local shape that imitates the defective local shapes of defective products occurring in the actual inspection environment.
The photographing unit 150 photographs the teacher object 120 and generates image information (the teacher object data 160). At this time, the teacher object 120 having the defective local shape is provided to the inspection environment. In the inspection environment, the teacher object 120 is photographed using the photographing unit 150 used in the inspection. By this photographing, the teacher object data 160, which is image information on the shape of the teacher object 120, is acquired. The image information may be three-dimensional image information or two-dimensional image information. The teacher object data 160 may also be three-dimensional point cloud data. Any device that can acquire image information may be used for the photographing, regardless of the type and performance of the device.
Although the photographing unit 150 has been described here as an imaging device, the present disclosure is not limited to this. For example, the system may be configured to accept input of data such as images captured in advance by the imaging device used in the inspection. In this case, the photographing unit 150 can be referred to as a photographing data receiving unit.
The teacher object 120 is preferably photographed under a plurality of photographing conditions. For example, the teacher object 120 is photographed at a plurality of photographing angles and a plurality of photographing positions. This makes it possible to efficiently collect shape information from a single teacher object.
The creator or provider of the teacher object 120 possesses the position information data 170 of the defective local shape of the teacher object 120, which is information on the position and range of the defective local shape of the teacher object 120.
The position information data 170 of the defective local shape is determined, for example, by the creator or provider of the teacher object 120. This is explained with reference to FIG. 4. FIG. 4 is a schematic diagram showing an example of image data of the teacher object 120 in the generation system 100 of the defective product data 110. The image data contains a protrusion C1 as a defective local shape. The creator or provider of the teacher object 120 defines the protrusion C1 in FIG. 4 as a defective local shape and stores information on its position and range as the position information data 170 of the defective local shape.
The data extraction unit 130 extracts, from the teacher object data 160, the defective local shape data 180 containing the information on the defective local shape, based on the position information data 170 of the defective local shape. The defective local shape data 180 is image information unique to each inspection environment. In other words, even when the same teacher object 120 is used, the obtained defective local shape data 180 differs from one inspection environment to another. As described above, the defective local shape data 180 is image data on the defective local shape and may also be referred to as first data.
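The extraction step can be sketched as a crop of the photographed teacher-object image using the recorded position and range. The rectangular region format below is an assumption for illustration; the disclosure does not fix how the position information is encoded.

```python
import numpy as np

def extract_defect_patch(teacher_image, position_info):
    """Crop the defective local shape from the teacher-object image.

    position_info: (row, col, height, width) of the defect region,
    as recorded by the creator of the teacher object (assumed format).
    """
    r, c, h, w = position_info
    return teacher_image[r:r + h, c:c + w].copy()

teacher = np.zeros((8, 8))
teacher[2:4, 3:6] = 5.0                        # a 2x3 "protrusion" region
patch = extract_defect_patch(teacher, (2, 3, 2, 3))
```

Because the crop is taken from an image captured with the inspection sensor, the patch carries that environment's noise and imaging characteristics.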
The non-defective product data 190 is image data on an object to be judged by the learner (in other words, an object to be recognized) that has no defective local shape, and may also be referred to as second data. The non-defective product data 190 is assumed to be data obtained in the inspection environment, but may be data obtained outside the inspection environment.
The data expansion unit 140 generates the defective product data 110 by combining the above-described defective local shape data 180 with the non-defective product data 190.
This "combination" may be simply adding (compositing) the defective local shape data 180 to the non-defective product data 190, or adding (compositing) it after performing data expansion processing on the defective local shape data 180. The data expansion processing is performed by selectively using parameters of the data expansion processing.
Note that the "data expansion processing" in the present disclosure may include, in addition to processing executed on the defective local shape data 180, processing that newly generates learning data based on the parameters of the data expansion processing without using the defective local shape data 180. For example, the data expansion processing may include processing that generates, by CG technology, image data including a non-defective bead B1 or image data including a defective bead B1 without using the first data. Further, in the data expansion processing, for example, data on defective shapes may be added to non-defective product data, or data on shapes other than defective shapes may be added to non-defective product data.
The "parameters of the data expansion processing" in the present disclosure refer to the degree of data expansion processing, such as translation, scaling, rotation, flipping, or noise addition, executed on part or all of the data to be processed. For example, when image data of a defective bead B1 having a protrusion on its surface is the data to be processed, the parameters of the data expansion processing may include the amount by which the protrusion is moved, the dimensions of the protrusion, the amount by which the protrusion is rotated, and the like. A changeable range is set for each type of processing parameter. For example, when the parameter is the movement amount of the protrusion, the movement amount can be changed in the range of 0 to several tens of mm. A parameter of the data expansion processing may also be a single value.
In the present embodiment, there may be a plurality of types of data expansion processing parameters. For example, in the data expansion processing, in addition to the position and size at which the local shape generated based on the learning data is superimposed, one or more parameters can be selected as appropriate from parameters such as aspect ratio, shape, angle, bottom shape, depth, and smoothness ratio. A changeable range is set for each of the plurality of types of parameters.
Using the first data, one or more of the plurality of types of parameters are selected, and the data expansion processing is sequentially executed on the second data while the processing amount for each parameter is varied within its changeable range. By this data expansion processing, a large number of learning data items are generated from a single item of first data. In the present embodiment, a large number of defective product data items 110 are generated by applying data expansion to the non-defective product data 190 using the defective local shape data 180 obtained from one teacher object 120.
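The sweep over parameter ranges can be sketched as iteration over a small parameter grid, producing one synthetic defective sample per parameter combination. The grid (positions and integer scale factors) is an illustrative assumption.

```python
import itertools
import numpy as np

def generate_variants(good_image, patch, positions, scales):
    """Generate one synthetic defective image per (position, scale) combination."""
    variants = []
    for (r, c), s in itertools.product(positions, scales):
        scaled = np.kron(patch, np.ones((s, s)))   # integer upscaling of the patch
        out = good_image.copy()
        h, w = scaled.shape
        out[r:r + h, c:c + w] += scaled            # paste at the swept position
        variants.append(out)
    return variants

good = np.zeros((10, 10))
patch = np.ones((2, 2))
samples = generate_variants(good, patch,
                            positions=[(0, 0), (5, 5)], scales=[1, 2])
```

Two positions times two scales already yield four distinct defective samples from a single extracted patch; adding more parameter types multiplies the count further.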
As an example, assume that a welding sample containing a defective local shape as shown in FIG. 4 has been produced. This welding sample is a defective product in which a protrusion C1 protrudes from the surface of the bead B1. The creator or provider of the welding sample possesses information on the position and range of the protrusion C1 present in this welding sample. In the inspection environment, this welding sample is photographed and image information is acquired. Based on this image information and the information on the position and range of the protrusion C1, image information on the shape of the protrusion C1 is extracted. Based on the extracted information on the shape of the protrusion C1, for example, by applying data expansion to image data of a welding sample having no defective local shape, image data of a defective welding sample newly having a defective local shape is generated. At that time, by performing the data expansion using one or more of the parameters of the data expansion processing described above, a wide variety of defective product data 110 is generated from one teacher object 120. For example, it is possible to generate defective product data 110 of welding samples in which the size, position, angle, and number of the protrusions C1 are varied.
As another example, assume that non-defective product data 190 including image data as shown in FIG. 5A exists. This image data is data of a so-called non-defective welding sample having no defective local shape. Suppose the data extraction unit 130 described above has extracted information on the shape of the protrusion C1 on the surface of the teacher object of the bead B1. By executing data expansion processing that adds the protrusion C1 to the non-defective bead B1 of FIG. 5A, image data as shown in FIG. 5B can be generated. At that time, as described above with respect to the parameters of the data expansion processing, the position, size, angle, and the like of the protrusion C1 can be changed freely. The image data shown in FIG. 5B is stored in the data expansion unit 140 as defective product data 110.
Further, by producing the teacher object 120 from a plurality of materials, it is possible to generate defective product data 110 of an intermediate material. For example, when the teacher object 120 is made of low-carbon steel and high-carbon steel, it is possible to generate defective product data 110 of carbon steel having an intermediate carbon content. This makes it possible to efficiently generate various types of defective product data 110 from a single teacher object 120.
For example, the defective local shape data 180 obtained from the teacher object 120 includes not only the protrusion C1 but also a pit D1 and a hole E1 as illustrated in FIG. 5C. It is also possible to select one or more of the protrusion C1, the pit D1, and the hole E1, perform data expansion processing on the non-defective product data 190 shown in FIG. 5A, and obtain, for example, the image data shown in FIG. 5C. The image data shown in FIG. 5C can also be stored in the data expansion unit 140 as defective product data 110.
(Second Embodiment)
FIG. 6 is a block diagram showing a generation system 200 of non-defective product data 210 according to the second embodiment of the present disclosure.
In the second embodiment, data on local shapes of non-defective products can be combined with non-defective product data to generate non-defective product data 210 that can further improve the class recognition rate. In this case, as shown in FIG. 6, the first data corresponds to non-defective local shape data 280, and the second data corresponds to non-defective product data 290. Position information data 270 of the defective local shape is data on the position and range of the defective local shape of a teacher object 220. Therefore, image information that is not extracted by the data extraction unit 130 can be said not to be information on the defective local shape. In other words, in the process of extracting information on the defective local shape, information on shapes other than the defective local shape can also be acquired. A shape other than a defective local shape is a local shape of a non-defective product. In this case, the non-defective local shape data 280 is information on the shapes of dents and protrusions that exist outside the position and range of the defective local shape and were not recognized as defects. By performing data expansion processing on the non-defective product data 290 using this non-defective local shape data 280, the non-defective product data 210 is newly generated.
Therefore, in the second embodiment, the teacher object 220 is imaged by a photographing unit 250, features of the teacher object 220 other than the defective local shape are collected as the first learning data, that is, teacher object data 260, and data expansion processing can be performed on the non-defective product data 290. In an actual welding process, there are few non-defective products with no local shapes at all, so the non-defective product data 210 generated in this embodiment has the effect of improving the learner's judgment accuracy for non-defective products.
Note that the parameters of the data expansion processing used to generate the defective product data 110 and the non-defective product data 210 are common, and both kinds of data can be generated at the same time. That is, learning data on both defective and non-defective products is efficiently generated from a single teacher object.
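Since a shared parameter set drives both generation paths, the two kinds of learning data can be produced in a single pass, as in the following sketch. The patch roles and labels are assumptions for illustration: one patch stands for a defective local shape, the other for a benign local shape that should still be judged non-defective.

```python
import numpy as np

def generate_pair(good_image, defect_patch, ok_patch, position):
    """Generate one defective and one non-defective sample with shared parameters."""
    r, c = position

    def paste(base, patch):
        out = base.copy()
        h, w = patch.shape
        out[r:r + h, c:c + w] += patch
        return out

    return (paste(good_image, defect_patch), 'defective'), \
           (paste(good_image, ok_patch), 'good')

good = np.zeros((6, 6))
(bad_img, bad_label), (ok_img, ok_label) = generate_pair(
    good, np.full((2, 2), 3.0), np.full((2, 2), 0.5), (1, 1))
```

The same position (and, in a fuller version, the same size, angle, and other parameters) is applied to both samples, mirroring the shared-parameter generation described above.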
(Third Embodiment)
A defective product data generation system 100 and a data generation method according to the third embodiment of the present disclosure will be described. The defective product data generation system 100 according to the present embodiment is the same as the generation system 100 of the defective product data 110 according to the first embodiment, shown in the block diagram of FIG. 3. In the following, examples of methods of generating the defective product data 110 are mainly described.
(1) First Example
A method of generating the defective product data 110 in the first example is described below.
FIG. 7A shows an image (non-defective product data 190) of the vicinity of the bead B1 formed by welding the first plate B11 and the second plate B12 in a non-defective welding sample. FIG. 7B shows a partially enlarged image of a region R1 of this bead B1. The first plate B11 and the second plate B12 are made of, for example, metal. The bead B1 is also made of, for example, metal.
On the other hand, FIG. 8A shows an image (teacher object data 160) of the vicinity of the bead B1 (the teacher object 120) when the first plate B11 and the second plate B12 are welded in a defective welding sample having a recess D2. FIG. 8B shows a partially enlarged image (defective local shape data 180) obtained by imaging the recess D2 in the bead B1. A virtual image F2 is also shown in FIG. 8B.
Next, the virtual image F2 is described with reference to FIGS. 9A and 9B. FIG. 9A shows an image representing the actual shape of the recess D2 of the bead B1 in the defective welding sample. FIG. 9B shows a photograph of the virtual image F2 of the recess D2 of the bead B1 in the defective welding sample.
Although the actual defect of the second plate B12 is the recess D2 shown in FIG. 9A, depending on the arrangement of the camera (the photographing unit 150) of the data extraction unit 130 of the generation system 100 and of the light source inside the system, the virtual image F2 shown in FIG. 9B may be captured at the position of the recess D2 instead of its actual appearance. Further, in a configuration in which the photographing unit 150 irradiates the object with a laser and acquires the shape from the image formed by the irradiation or from the time until the laser returns, a virtual image arises when the laser light is repeatedly reflected or diffused inside the recess. A virtual image F2 may also be captured due to defects of the bead B1 other than the recess D2. The present embodiment relates to the generation of the defective product data 110 in the case where the virtual image F2 is captured.
Superimposing the image shown in FIG. 8B on the image shown in FIG. 7B yields an image having the virtual image F2 shown in FIG. 10. That is, FIG. 10 is a diagram showing a composite of the enlarged image of the vicinity of the bead B1 of the non-defective weld and the enlarged image of the vicinity of the bead B1 of the defective weld. By superimposing this on the image data (the non-defective product data 190) shown in FIG. 7A, image data representing a defective product is obtained. The data expansion unit 140 takes in this image data as defective product data 110 and uses it for learning.
(2) Second Example
The method of generating the defective product data 110 in the second example is described below.
FIG. 11 shows an image of the vicinity of the bead B1, the welded portion between the first plate B11 and the second plate B12, in another defective weld sample. In the weld sample shown in FIG. 11, various defects are present in the boxed regions, including the regions R1A, R1B, and R1C. An enlarged image of the region R1A is shown in FIG. 12A, of the region R1B in FIG. 12B, and of the region R1C in FIG. 12C. FIGS. 12A to 12C show images of virtual images F2 of defects of the bead B1.
Images of the virtual image F2 in other regions are shown in FIGS. 13A to 13D, and in still other regions in FIGS. 14A to 14D. Superimposing the images of the virtual image F2 in these regions on the image shown in FIG. 7B yields images containing the virtual image F2. Superimposing each such image on the image data shown in FIG. 7A yields image data representing a defective product. The data expansion unit 140 takes in this image data as defective product data 110 and learns from it.
The images of FIGS. 12A to 12C, 13A to 13D, and 14A to 14D were obtained by identifying and cropping a plurality of regions from the image, captured by the imaging unit 150, of the single bead B1 shown in FIG. 11. Alternatively, a plurality of imaging conditions may be set by varying the conditions of the imaging unit 150, and the bead B1 may be imaged under each condition to obtain a plurality of images. For example, the imaging unit 150 may be placed at a plurality of positions, and a predetermined region of the bead B1 may be imaged from each position. The images obtained in this way are similar to those shown in FIGS. 12A to 14D.
(3) Third Example
The method of generating the defective product data 110 in the third example is described below.
A weld sample in which two kinds of metal are welded is described here: the first plate B11 is welded to a second plate B12a and a second plate B12b made of two different kinds of metal.
FIG. 15A is an enlarged view of the first plate B11 welded to the second plates B12a and B12b, which are made of two different materials, specifically two mutually different metals, by a bead B1a made of one metal and a bead B1b made of a metal different from that of the bead B1a. FIG. 15B shows an image of the virtual image F2A corresponding to the recess D2A in the region R1D of the bead B1a. FIG. 15C shows an image of the virtual image F2B corresponding to the recess D2B in the region R1E of the bead B1a.
FIGS. 16A to 16C show virtual images obtained by scaling the virtual image F2A of FIG. 15B and the virtual image F2B of FIG. 15C by predetermined factors and superimposing them. Specifically, each is the virtual image obtained when the height and width of the virtual image F2A are multiplied by α and the height and width of the virtual image F2B are multiplied by (1 - α), where 0 < α < 1. More specifically, FIG. 16A shows the virtual image for α = 0.2, FIG. 16B for α = 0.5, and FIG. 16C for α = 0.8. The above is one example of a method of fusing the virtual image F2A with the virtual image F2B; an intermediate image between the virtual image F2A and the virtual image F2B may instead be generated by a technique such as morphing.
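The α-scaling step above can be sketched as follows. This is a minimal illustration under stated assumptions: nearest-neighbour resizing, top-left alignment on a common canvas, and a pixel-wise maximum as the superimposition rule are all choices made here for illustration; the text specifies only that height and width are scaled by α and (1 - α) before superimposing.

```python
import numpy as np

def resize_nn(img: np.ndarray, new_h: int, new_w: int) -> np.ndarray:
    """Nearest-neighbour resize using NumPy integer indexing only."""
    h, w = img.shape
    rows = np.arange(new_h) * h // new_h
    cols = np.arange(new_w) * w // new_w
    return img[rows][:, cols]

def blend_virtual_images(f2a: np.ndarray, f2b: np.ndarray, alpha: float) -> np.ndarray:
    """Scale F2A's height/width by alpha and F2B's by (1 - alpha), then
    superimpose the scaled patches on a common canvas (brighter pixel wins)."""
    assert 0.0 < alpha < 1.0
    ha, wa = f2a.shape
    hb, wb = f2b.shape
    a = resize_nn(f2a, max(1, round(ha * alpha)), max(1, round(wa * alpha)))
    b = resize_nn(f2b, max(1, round(hb * (1 - alpha))), max(1, round(wb * (1 - alpha))))
    canvas = np.zeros((max(a.shape[0], b.shape[0]), max(a.shape[1], b.shape[1])))
    canvas[:a.shape[0], :a.shape[1]] = a
    canvas[:b.shape[0], :b.shape[1]] = np.maximum(canvas[:b.shape[0], :b.shape[1]], b)
    return canvas
```

Calling this for α = 0.2, 0.5, and 0.8 would produce intermediate virtual images analogous to FIGS. 16A to 16C.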
Superimposing these virtual images on image data of non-defective products yields image data representing a plurality of defective products. The data expansion unit 140 takes in the image data representing these defective products as defective product data 110 and learns from it.
(4) Fourth Example
Next, a method of obtaining a virtual image and a method of generating welding data of a defective product are described as a fourth example with reference to FIGS. 17 to 20C. Both methods are performed by the data expansion unit 140.
FIG. 17 shows an image in which the vicinity of a virtual image, captured when the welded portion was imaged, is cropped to a rectangle; it illustrates the interpolation process in which the rectangle is divided at equal intervals into M rows and N columns, that is, into M × N small square images.
FIG. 18 is an explanatory diagram showing the process of extracting only the virtual image F2 from a real image.
(4-1) Interpolated Luminance Values
First, as shown in FIG. 17, a rectangular image containing a virtual image is divided at equal intervals into M rows and N columns, that is, into M × N small square images (hereinafter, pixels), where M and N are integers. The pixels are then labeled as follows. The top-left pixel in FIG. 17 is labeled (0, 0), and the N pixels of the top row are labeled (0, 0), (1, 0), ..., (n, 0), ..., (N-1, 0), where n is an integer. The leftmost pixel of the second row is labeled (0, 1), and the pixels of that row are labeled (1, 1), ..., (n, 1), ..., (N-1, 1). Similarly, the leftmost pixel of the (m+1)-th row, where m is an integer, is labeled (0, m), and the pixels of that row are labeled (1, m), ..., (n, m), ..., (N-1, m). This labeling is carried out down to the M-th row, so that all M × N pixels arranged in a matrix are labeled; the pixels of the bottom row are labeled (0, M-1), (1, M-1), ..., (n, M-1), ..., (N-1, M-1). Hereinafter, these M × N pixels arranged in a matrix are referred to as the pixel matrix.
Next, a luminance value is calculated as the interpolated value P(n, m) for each pixel (n, m) of the pixel matrix other than the outermost pixels, that is, excluding:
Top row: (0, 0), (1, 0), ..., (n, 0), ..., (N-1, 0)
Bottom row: (0, M-1), (1, M-1), ..., (n, M-1), ..., (N-1, M-1)
Leftmost column: (0, 0), (0, 1), ..., (0, m), ..., (0, M-1)
Rightmost column: (N-1, 0), (N-1, 1), ..., (N-1, m), ..., (N-1, M-1)
so that 1 ≤ n ≤ N-2 and 1 ≤ m ≤ M-2. The method of calculating this interpolated value is described below.
(4-2) Method of Calculating the Interpolated Value
First, the luminances of the pixels of the top row, bottom row, leftmost column, and rightmost column are obtained from the luminance of the real image contained in each pixel block. A real image is an actual image captured by the camera. Luminance here means the following: with pure white taken as 1 and pure black as 0, the target image is compared with images formed by mixing white and black in various ratios, and the luminance is the white-to-black mixing ratio of the matching image. The luminance of the pixel (0, m) in the (m+1)-th row and leftmost column obtained in this way is denoted P(0, m), and the luminance of the pixel (N-1, m) in the (m+1)-th row and rightmost column is denoted P(N-1, m). Similarly, the luminance of the pixel (n, 0) in the (n+1)-th column and top row is denoted P(n, 0), and the luminance of the pixel (n, M-1) in the (n+1)-th column and bottom row is denoted P(n, M-1). The interpolated luminance value P(n, m) of the pixel (n, m) in the (n+1)-th column and (m+1)-th row is then
given by the interpolation formula of the original publication (presented there as an equation image and not reproduced in this text). Obtaining P(n, m) in this way is referred to as the method of calculating the interpolated value. For the outermost pixels of the pixel matrix, P(n, m) is the luminance of the real image itself. An image formed from the interpolated values P(n, m) obtained in this way is shown in FIG. 17 and in FIG. 18(b).
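Since the interpolation formula itself is not reproduced above, the following sketch assumes one common choice that is consistent with the surrounding description: each interior value P(n, m) is taken as the average of a horizontal linear interpolation between P(0, m) and P(N-1, m) and a vertical linear interpolation between P(n, 0) and P(n, M-1). Both the function names and this exact formula are illustrative assumptions, not the publication's definition; the subtraction step corresponds to (4-3) below.

```python
import numpy as np

def interpolate_background(patch: np.ndarray) -> np.ndarray:
    """Estimate background luminance P(n, m) of an M x N patch from its
    outermost rows/columns only (assumed formula: mean of horizontal and
    vertical linear interpolation between the edge values)."""
    M, N = patch.shape
    P = patch.astype(float).copy()  # outer edge keeps the real-image luminance
    for m in range(1, M - 1):
        for n in range(1, N - 1):
            horiz = ((N - 1 - n) * patch[m, 0] + n * patch[m, N - 1]) / (N - 1)
            vert = ((M - 1 - m) * patch[0, n] + m * patch[M - 1, n]) / (M - 1)
            P[m, n] = (horiz + vert) / 2.0
    return P

def extract_virtual_image(patch: np.ndarray) -> np.ndarray:
    """Subtract the interpolated background from the real image, leaving
    only the virtual image (cf. FIG. 18(c))."""
    return patch.astype(float) - interpolate_background(patch)
```

On a patch whose edges carry only background luminance, the subtraction leaves zero everywhere except where the virtual image brightens the interior.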
(4-3) Method of Obtaining a Virtual Image of a Protrusion
Next, P(n, m) is subtracted from the luminance of the real image (FIG. 18(a)) at each pixel (n, m). Rendering the subtracted values as an image yields an image of only the virtual image of the protrusion, shown in FIG. 18(c).
(4-4) Creation of Image Data of a Defective Product
FIG. 19A shows a photograph of the vicinity of the bead B1 when the first plate B11 and the second plate B12 are welded, in a defective weld sample.
FIG. 19B shows a photograph of the recess D2 cropped from the region R1F of the bead B1 in FIG. 19A. FIG. 19C shows the virtual image F2 obtained in (4-3).
The virtual image F2 of FIG. 19C is scaled to various sizes, and each scaled version is superimposed on the photograph of the recess D2 shown in FIG. 19B. This yields the images containing virtual images F2 of various sizes shown in FIGS. 20A to 20C.
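Generating size variants of the extracted virtual image can be sketched with integer scale factors; `np.kron` repeats each pixel k × k times, i.e. a nearest-neighbour upscale. The factors and function name are illustrative assumptions; the text does not restrict the scaling method.

```python
import numpy as np

def scaled_variants(f2: np.ndarray, factors=(1, 2, 3)) -> list:
    """Produce integer-scaled copies of the virtual image F2
    (cf. the various sizes in FIGS. 20A-20C)."""
    return [np.kron(f2, np.ones((k, k))) for k in factors]
```

Each variant would then be superimposed on the recess photograph as described above.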
Superimposing the images containing the virtual image F2 shown in FIGS. 20A to 20C onto the region R1F shown in FIG. 19A yields welding image data representing defective products.
(4-5) Learning of Welding Image Data Representing Defective Products
The data expansion unit 140 takes in the welding image data representing defective products obtained in (4-4) as defective product data 110 and learns from it.
(Fourth Embodiment)
A learning data generation method according to a fourth embodiment of the present disclosure, a learning data generation system using the method, and an additional learning necessity device provided with the learning data generation system are described below, together with a welding device provided with the additional learning necessity device.
FIG. 21 shows a block diagram of the additional learning necessity device 300 according to the fourth embodiment and of a welding device 301 provided with it. The welding device 301 is controlled by a controller 302. The additional learning necessity device 300 incorporates the defective product data generation system 100 shown in FIG. 3 and described in the first embodiment, and transmits data such as the defective product data 110 to the controller 302. The controller 302 receives data such as the defective product data 110 and controls the welding device 301 based on that data.
Next, the additional learning necessity device 300 is described in detail. The additional learning necessity device 300 includes a data generation device 101. The data generation device 101 incorporates the generation system 100 shown in FIG. 3 and has the data extraction unit 130 and the data expansion unit 140. The data generation device 101 also includes a central processing unit (CPU), a memory, and a communication device.
The data extraction unit 130 is an arithmetic device that extracts desired data, and has the imaging unit 150 and a data generation unit 161. The imaging unit 150 is, for example, a camera provided with an image sensor, and images the teacher object 120. Examples of the image sensor include a charge-coupled device (CCD) and a CMOS sensor.
FIG. 22 shows a block diagram of the data generation unit 161. The data generation unit 161 has an arithmetic device 163, for example one equipped with a CPU, and a memory 162 such as a hard disk drive or an SSD, and stores the data of the teacher object 120 imaged by the imaging unit 150 (the teacher object data 160) in the memory 162. The data generation unit 161 also acquires position information data 170 of defective local shapes from a first data storage unit 171. The first data storage unit 171 has a memory such as a hard disk drive or an SSD and stores the position information data 170 of defective local shapes. The data generation unit 161 combines the teacher object data 160 stored in the memory 162 with the position information data 170 of defective local shapes, generates defective local shape data 180, and transmits it to the data expansion unit 140.
The data expansion unit 140 is an arithmetic device that generates the defective product data 110, and has a second data storage unit 181, a third data storage unit 191, and a data combination device 111. The second data storage unit 181 has a memory such as a hard disk drive or an SSD and stores the defective local shape data 180 received from the data extraction unit 130. The third data storage unit 191 has a memory such as a hard disk drive or an SSD and stores the non-defective product data 190.
FIG. 23 shows a block diagram of the data combination device 111. The data combination device 111 has an arithmetic device 112, for example one equipped with a CPU, and a data expansion memory 113 such as a hard disk drive or an SSD, and combines the defective local shape data 180 with the non-defective product data 190 to generate the defective product data 110. The data combination device 111 then learns the defective product data 110 as learning data by storing the generated defective product data 110 in the data expansion memory 113, and transmits the defective product data 110 to the controller 302.
The controller 302 has a central processing unit, a memory, and a communication device. On receiving the defective product data 110 from the data combination device 111, the controller 302 controls the welding device 301 and obtains data such as new non-defective product data 190 and new position information data 170 of defective local shapes. These data are transmitted back to the first data storage unit 171 and the third data storage unit 191.
In this way, the data generation device 101 acquires data related to welding and newly generates defective product data 110.
The controller 302 then controls the welding device 301 based on these data and causes the welding device 301 to perform welding. The additional learning necessity device 300 compares the data of the bead B1 obtained as a result of the welding with the defective product data 110, determines whether the weld is good or defective, and decides whether to perform re-learning.
Next, how the additional learning necessity device 300 uses the defective product data 110 to determine whether a weld is good or defective and to decide whether to perform re-learning is described with reference to the flowchart shown in FIG. 24.
The additional learning necessity device 300 obtains the data of the bead B1 resulting from welding by the welding device 301 (step S1). The additional learning necessity device 300 also obtains defective product data 110 using the data generation device 101 (step S2). In step S2, the data generation device 101 generates the defective product data 110 according to a previously constructed AI model. Here, the "previously constructed AI model" may be an AI model that the additional learning necessity device 300 is provided with in advance, or may be, for example, an AI model trained immediately beforehand, for instance according to the third embodiment. The "AI model" is a model that generates defective product data 110 and that compares the bead B1 data described below with the obtained defective product data 110 to infer whether the weld is good or defective. When the "previously constructed AI model" is an AI model provided in advance, the defective product data 110 is likewise provided in the additional learning necessity device 300 in advance.
Next, the additional learning necessity device 300 compares the bead B1 data obtained according to the previously constructed AI model with the obtained defective product data 110, and infers whether the weld is good or defective (step S3).
Next, the additional learning necessity device 300 judges whether the inference result obtained in step S3 is good (Yes) or defective (No) (step S4). Based on this result, it decides whether to store and learn further defective product data 110.
If the result in step S4 is good (Yes), that is, if non-defective and defective products can be selected accurately, the additional learning necessity device 300 instructs the data generation device 101 to stop generating defective product data 110 (learning stopped, step S5).
On the other hand, if the result in step S4 is defective, the additional learning necessity device 300 instructs the data generation device 101 to learn (step S6), that is, to newly generate defective product data 110 (step S7). The additional learning necessity device 300 then updates the AI model or constructs a new AI model (step S8, "new AI model").
In step S4, a small amount of data may be created by data augmentation according to the method of the present embodiment, and a "defective" judgment may be made when the inference accuracy on the created data is lower than a predetermined value. In that case, data augmentation generating a larger amount of teacher data may be performed and additional learning (re-learning) carried out (steps S6 to S8). The predetermined value may be, for example, the non-defective rate of the beads B1 obtained by welding, that is, the number of non-defective beads B1 divided by the total number of beads B1, expressed as a percentage:

non-defective rate [%] = (number of non-defective beads B1 / total number of beads B1) × 100

Here, the "total number of beads B1" is the number of beads B1 actually welded, and the "number of non-defective beads B1" is the number of those actually welded beads B1 that are visually non-defective. In this way, welding can be judged good or defective with the same accuracy as visual confirmation, without actually confirming the welds visually after learning.
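The threshold check described above can be sketched directly from the definition of the non-defective rate. The function name is illustrative; the comparison itself follows the text.

```python
def needs_relearning(num_good_beads: int, num_total_beads: int,
                     inference_accuracy_pct: float) -> bool:
    """Step S4 sketch: judge 'defective' (re-learning needed, steps S6-S8)
    when the inference accuracy on augmented data falls below the
    non-defective rate of the actually welded beads, used as the
    predetermined value."""
    if num_total_beads == 0:
        raise ValueError("at least one welded bead is required")
    good_rate_pct = 100.0 * num_good_beads / num_total_beads
    return inference_accuracy_pct < good_rate_pct
```

For example, with 90 visually non-defective beads out of 100 welded, the predetermined value is 90%, and re-learning is triggered whenever the inference accuracy falls below that.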
Viewed as a system, the additional learning necessity device 300 can be regarded as a learning data generation system.
Next, methods of generating the defective local shape data 180 are described.
The defective local shape data 180 can be obtained by the following means.
(First Means)
The first means is as follows. The bead B1 is imaged using the imaging unit 150 to obtain an image, that is, teacher object data 160. The data generation unit 161 collates the teacher object data 160 with a plurality of pieces of position information data 170 of defective local shapes acquired from the first data storage unit 171, and extracts a plurality of pieces of defective local data from the teacher object data 160. The data generation unit 161 then fuses these pieces of defective local data to obtain the defective local shape data 180.
This first means is described with reference to FIG. 25, which is a block diagram of the means for obtaining the defective local shape data 180.
撮影部150においてビードB1を撮影し、教師物体データ160を得る。得られた教師物体データ160は、例えば図11に示す画像である。次に、データ生成部161において、教師物体データ160と、複数の不良局所形状の位置情報データ170とを照合し、不良局所データαi(iは1からnまでのいずれかの整数)を得る。不良局所データαiは、例えば図12A~図12C、図13A~図13Dおよび図14A~図14Dのいずれかに示す画像である。そしてデータ生成部161において不良局所データα1、α2、・・・、αnを融合し、不良局所形状データ180を得る。
The bead B1 is photographed by the photographing unit 150, and the teacher object data 160 is obtained. The obtained teacher object data 160 is, for example, the image shown in FIG. 11. Next, the data generation unit 161 collates the teacher object data 160 with the position information data 170 of a plurality of defective local shapes, and obtains defective local data αi (i is an integer from 1 to n). The defective local data αi is, for example, an image shown in any of FIGS. 12A to 12C, 13A to 13D, and 14A to 14D. Then, the data generation unit 161 fuses the defective local data α1, α2, ..., αn to obtain the defective local shape data 180.
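For illustration only (this sketch is not part of the disclosure; the function names, the array representation of the image, and the (row, column, height, width) form of the position information are all assumptions), the first means of cropping the defective local data αi out of the teacher object data 160 using the position information data 170 and fusing them into the defective local shape data 180 could look like:

```python
import numpy as np

def extract_defective_patches(teacher_image, positions):
    """Crop a patch alpha_i for each (row, col, height, width) position entry."""
    return [teacher_image[r:r + h, c:c + w].copy() for (r, c, h, w) in positions]

def fuse_patches(patches, canvas_shape, positions):
    """Paste the patches alpha_1..alpha_n back onto a blank canvas,
    yielding one fused defective-local-shape array."""
    canvas = np.zeros(canvas_shape, dtype=np.float32)
    for patch, (r, c, h, w) in zip(patches, positions):
        canvas[r:r + h, c:c + w] = patch
    return canvas

# Toy example: a 6x6 "image" with two 2x2 defective regions.
image = np.arange(1, 37, dtype=np.float32).reshape(6, 6)
pos = [(0, 0, 2, 2), (4, 4, 2, 2)]
patches = extract_defective_patches(image, pos)
fused = fuse_patches(patches, image.shape, pos)
```

In practice the positions would come from the first data storage unit 171 and the image from the photographing unit 150; only the regions listed in the position information survive in the fused result.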
(第2の手段)
第2の手段は、以下に示す手段である。すなわち、撮影部150の撮影条件(例えば撮影部150の位置や撮影に用いる光源の発光強度等)を様々に変更して複数の撮影条件を設定し、それぞれ異なる撮影条件でもってビードB1を撮影して複数の画像すなわち複数の教師物体データ160を得る。なお、教師物体データ160は、教師物体120の一部領域の画像であり得る。当該複数の教師物体データ160と、第1のデータ格納部171から取得した不良局所形状の位置情報データ170とをデータ生成部161にて照合し、複数の教師物体データ160より複数の不良局所データを抽出する。そしてこれら複数の不良局所データをデータ生成部161にて融合し、不良局所形状データ180を得る。 (Second means)
The second means is as follows. That is, a plurality of shooting conditions are set by variously changing the shooting conditions of the photographing unit 150 (for example, the position of the photographing unit 150, the emission intensity of the light source used for shooting, etc.), and the bead B1 is photographed under the different shooting conditions to obtain a plurality of images, that is, a plurality of teacher object data 160. The teacher object data 160 may be an image of a partial area of the teacher object 120. The plurality of teacher object data 160 and the position information data 170 of the defective local shapes acquired from the first data storage unit 171 are collated by the data generation unit 161, and a plurality of defective local data are extracted from the plurality of teacher object data 160. Then, these plurality of defective local data are fused by the data generation unit 161 to obtain defective local shape data 180.
この第2の手段を、図26を用いて説明する。すなわち、図26は、不良局所形状データ180を得る手段のブロック図である。
This second means will be described with reference to FIG. 26. That is, FIG. 26 is a block diagram of means for obtaining defective local shape data 180.
撮影部150において、不良局所形状の位置情報データ170がn個あるとし、撮影条件をn通り設定する。なお、nは自然数である。また、撮影条件は、例えば撮影領域である。このn通りの条件によって撮影された複数の教師物体データ160をそれぞれβ1、β2、・・・、βnとする。データβi(iは1からnまでのいずれかの整数)は、例えば図12A~図12C、図13A~図13Dおよび図14A~図14Dのいずれかに示す画像である。次に、データ生成部161において、各々のデータβi(iは1からnまでのいずれかの整数)を不良局所形状の位置情報データ170と照合する。そしてデータ生成部161においてデータβ1、β2、・・・、βnを融合し、不良局所形状データ180を得る。
In the photographing unit 150, it is assumed that there are n pieces of position information data 170 of the defective local shape, and n shooting conditions are set. Note that n is a natural number. The shooting condition is, for example, a shooting area. Let β1, β2, ..., βn be the plurality of teacher object data 160 captured under these n conditions, respectively. The data βi (i is an integer from 1 to n) is, for example, an image shown in any of FIGS. 12A to 12C, 13A to 13D, and 14A to 14D. Next, the data generation unit 161 collates each data βi (i is an integer from 1 to n) with the position information data 170 of the defective local shape. Then, the data generation unit 161 fuses the data β1, β2, ..., βn to obtain the defective local shape data 180.
なお、上記第1の手段、第2の手段において、n=1の場合も可能である。そのときは、第1の手段と第2の手段とは同じである。
It should be noted that in the first and second means described above, the case of n = 1 is also possible. In that case, the first means and the second means are the same.
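As an illustrative sketch of the second means (again not part of the disclosure; the per-condition captures βi are modeled as same-sized arrays and the region format is an assumption), each region can be taken from the capture βi obtained under the matching shooting condition and fused onto one canvas:

```python
import numpy as np

def fuse_condition_captures(captures, regions):
    """Copy region i from capture beta_i (shot under condition i) onto one
    canvas, producing the fused defective local shape data."""
    canvas = np.zeros_like(captures[0])
    for capture, (r, c, h, w) in zip(captures, regions):
        canvas[r:r + h, c:c + w] = capture[r:r + h, c:c + w]
    return canvas

# Two captures of the same bead under different lighting, two defect regions.
captures = [np.full((4, 4), 1.0), np.full((4, 4), 2.0)]
regions = [(0, 0, 2, 2), (2, 2, 2, 2)]
fused = fuse_condition_captures(captures, regions)
```

With n = 1 this collapses to a single crop, which matches the note above that the first and second means then coincide.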
上記実施形態にかかる学習データの生成方法、およびそれを用いた学習データの生成システム、ならびに学習データの生成システムを備えた追加学習要否装置300は、事前にAIモデルの学習を行っていても、顧客環境で撮影される形状が事前に使っていたデータと大きく異なる場合に、特に有用である。
The learning data generation method according to the above embodiment, the learning data generation system using it, and the additional learning necessity device 300 provided with the learning data generation system are particularly useful when the shape photographed in the customer environment differs significantly from the data used beforehand, even if the AI model has been trained in advance.
(第五の実施形態)
本開示の第五の実施形態にかかる学習データの生成方法、およびそれを用いた学習データの生成システム、ならびに学習データの生成システムを備えた追加学習要否装置について以下に説明する。また、当該追加学習要否装置を備えた溶接装置についても併せて説明する。 (Fifth Embodiment)
The learning data generation method according to the fifth embodiment of the present disclosure, the learning data generation system using the method, and the additional learning necessity device including the learning data generation system will be described below. In addition, a welding device equipped with the additional learning necessity device will also be described.
第五の実施形態にかかる追加学習要否装置400と、それを備えた溶接装置401のブロック図を図27に示す。溶接装置401は、コントローラ402によって制御される。追加学習要否装置400は、図6に示す不良品データの生成システム200を内蔵しており、コントローラ402に良品データ210等のデータを送信する。図6に示す良品データ210の生成システム200については、第二の実施形態に示されている。コントローラ402は、良品データ210等のデータを受信し、当該データに基づいて溶接装置401を制御する。
FIG. 27 shows a block diagram of the additional learning necessity device 400 according to the fifth embodiment and the welding device 401 provided with it. The welding device 401 is controlled by the controller 402. The additional learning necessity device 400 incorporates the defective product data generation system 200 shown in FIG. 6, and transmits data such as the non-defective product data 210 to the controller 402. The generation system 200 for the non-defective product data 210 shown in FIG. 6 is described in the second embodiment. The controller 402 receives data such as the non-defective product data 210 and controls the welding device 401 based on the data.
次に、追加学習要否装置400について具体的に述べる。追加学習要否装置400は、データ生成装置201を備えている。データ生成装置201は、図6に示す生成システム200を内蔵しており、データ抽出部230およびデータ拡張部240を有する。また、データ生成装置201はCPUやメモリ、通信装置を備える。
Next, the additional learning necessity device 400 will be specifically described. The additional learning necessity device 400 includes a data generation device 201. The data generation device 201 has a built-in generation system 200 shown in FIG. 6, and has a data extraction unit 230 and a data expansion unit 240. Further, the data generation device 201 includes a CPU, a memory, and a communication device.
データ抽出部230は所望のデータを抽出する演算装置であり、撮影部250と、データ生成部261とを有する。撮影部250は、例えば撮像素子を備えたカメラであり、教師物体220を撮像する。撮像素子としては、例えば電荷接合素子(CCD)やCMOSセンサがある。
The data extraction unit 230 is an arithmetic unit that extracts desired data, and has a photographing unit 250 and a data generation unit 261. The photographing unit 250 is, for example, a camera provided with an image sensor, and images the teacher object 220. Examples of the image sensor include a charge-coupled device (CCD) and a CMOS sensor.
データ生成部261に関するブロック図を図28に示す。データ生成部261は、例えばCPUを搭載した演算装置263と、例えばハードディスクドライブやSSD等のメモリ262を有する。メモリ262は、撮影部250によって撮像された教師物体220のデータ(教師物体データ260)を記憶する。また、データ生成部261は、第1のデータ格納部271から不良局所形状の位置情報データ270を取得する。なお、第1のデータ格納部271は、例えばハードディスクドライブやSSD等のメモリを有し、不良局所形状の位置情報データ270を記憶している。データ生成部261は、メモリ262が記憶した教師物体データ260と不良局所形状の位置情報データ270とを組み合わせて演算し、良品局所形状データ280を生成してデータ拡張部240に送信する。
FIG. 28 shows a block diagram of the data generation unit 261. The data generation unit 261 has, for example, an arithmetic unit 263 equipped with a CPU, and a memory 262 such as a hard disk drive or an SSD. The memory 262 stores the data of the teacher object 220 imaged by the photographing unit 250 (teacher object data 260). Further, the data generation unit 261 acquires the position information data 270 of the defective local shape from the first data storage unit 271. The first data storage unit 271 has, for example, a memory such as a hard disk drive or an SSD, and stores the position information data 270 of the defective local shape. The data generation unit 261 performs an operation combining the teacher object data 260 stored in the memory 262 with the position information data 270 of the defective local shape, generates the non-defective local shape data 280, and transmits it to the data expansion unit 240.
データ拡張部240は良品データ210を生成する演算装置であり、第2のデータ格納部281と、第3のデータ格納部291と、データ組み合わせ装置211とを有する。第2のデータ格納部281は、例えばハードディスクドライブやSSD等のメモリを有し、データ抽出部230より受信した良品局所形状データ280を記憶する。第3のデータ格納部291は、例えばハードディスクドライブやSSD等のメモリを有し、良品データ290を格納する。
The data expansion unit 240 is an arithmetic unit that generates the non-defective product data 210, and has a second data storage unit 281, a third data storage unit 291, and a data combination device 211. The second data storage unit 281 has, for example, a memory such as a hard disk drive or an SSD, and stores the non-defective local shape data 280 received from the data extraction unit 230. The third data storage unit 291 has, for example, a memory such as a hard disk drive or an SSD, and stores the non-defective product data 290.
データ組み合わせ装置211に関するブロック図を図29に示す。データ組み合わせ装置211は、例えばCPUを搭載した演算装置212と、例えばハードディスクドライブやSSD等のデータ拡張メモリ213とを有し、良品局所形状データ280と良品データ290とを組み合わせて演算し、良品データ210を生成する。そしてデータ組み合わせ装置211は、生成された良品データ210をデータ拡張メモリ213に記憶させることによって良品データ210を学習データとして学習し、コントローラ402へ送信する。
FIG. 29 shows a block diagram of the data combination device 211. The data combination device 211 has, for example, an arithmetic unit 212 equipped with a CPU, and a data expansion memory 213 such as a hard disk drive or an SSD, and combines the non-defective local shape data 280 and the non-defective product data 290 to generate the non-defective product data 210. Then, the data combination device 211 learns the non-defective product data 210 as learning data by storing the generated non-defective product data 210 in the data expansion memory 213, and transmits the non-defective product data 210 to the controller 402.
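A minimal sketch of the combining operation performed by the data combination device 211 (hypothetical names only; this assumes the combination is a superimposition of a local-shape patch onto the non-defective base data, with the paste position treated as one augmentation parameter):

```python
import numpy as np

def superimpose(base, patch, top, left):
    """Paste the local-shape patch onto a copy of the non-defective base data."""
    out = base.copy()
    h, w = patch.shape
    out[top:top + h, left:left + w] = patch
    return out

def generate_samples(base, patch, offsets):
    """Vary the paste position to expand one patch into several
    training samples from a single non-defective base."""
    return [superimpose(base, patch, t, l) for (t, l) in offsets]

base = np.zeros((5, 5))   # stands in for non-defective product data 290
patch = np.ones((2, 2))   # stands in for non-defective local shape data 280
samples = generate_samples(base, patch, [(0, 0), (3, 3)])
```

Copying the base before pasting keeps the stored non-defective data intact, so the same base can be reused for every generated sample.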
コントローラ402は、中央演算装置やメモリ、通信装置を有している。コントローラ402は、データ組み合わせ装置211から良品データ210を受け取ると溶接装置401を制御し、新たな良品データ290や不良局所形状の位置情報データ270等のデータを得る。これらのデータを再び第1のデータ格納部271や第3のデータ格納部291に送信する。
The controller 402 has a central processing unit, a memory, and a communication device. When the controller 402 receives the non-defective product data 210 from the data combination device 211, it controls the welding device 401 to obtain data such as new non-defective product data 290 and position information data 270 of the defective local shape. These data are transmitted to the first data storage unit 271 and the third data storage unit 291 again.
このようにして、データ生成装置201は、溶接に関するデータを取得し、良品データ210を新たに生成する。
In this way, the data generation device 201 acquires data related to welding and newly generates non-defective product data 210.
そして追加学習要否装置400は、これらのデータに基づいて溶接装置401にて溶接を行わせる。追加学習要否装置400は、溶接の結果得られるビードB1と良品データ210とを比較し、溶接の良・不良判定を行い、再学習をするかどうかの判断をする。追加学習要否装置400が不良品データ110を用いて溶接物の良・不良判定をし、再学習をするかどうかの判断をすることについては、第四の実施形態と同様である。すなわち、良・不良判定の結果が良好すなわち良品と不良品とを精度よく選別できていれば、追加学習要否装置400はデータ生成装置201に良品データ210の生成をやめるよう指示する。また、良・不良判定の結果が不良であれば、追加学習要否装置400はデータ生成装置201に良品データ210を新たに生成するよう指示する。
Then, the additional learning necessity device 400 causes the welding device 401 to perform welding based on these data. The additional learning necessity device 400 compares the bead B1 obtained as a result of welding with the non-defective product data 210, determines whether the welding is good or defective, and judges whether or not to relearn. The additional learning necessity device 400 determines whether the welded product is good or defective by using the defective product data 110 and judges whether or not to relearn, as in the fourth embodiment. That is, if the result of the good/defective determination is good, that is, if non-defective products and defective products can be sorted accurately, the additional learning necessity device 400 instructs the data generation device 201 to stop generating the non-defective product data 210. If the result of the good/defective determination is poor, the additional learning necessity device 400 instructs the data generation device 201 to newly generate the non-defective product data 210.
なお、追加学習要否装置400は、システムとしてみれば、学習データ生成システムということができる。
The additional learning necessity device 400 can be said to be a learning data generation system when viewed as a system.
なお、不良局所形状データ180の生成方法については、第四の実施形態と同じである。
The method of generating the defective local shape data 180 is the same as that of the fourth embodiment.
(第六の実施形態)
本開示の第六の実施形態にかかる追加学習要否装置300およびそれを備えた溶接装置301について説明する。 (Sixth Embodiment)
The additional learning necessity device 300 and the welding device 301 provided with it according to the sixth embodiment of the present disclosure will be described.
この追加学習要否装置300およびそれを備えた溶接装置301の概略を図30に示す。図30において、溶接装置301は、追加学習要否装置300を内蔵する。追加学習要否装置300は、コントローラ302を内蔵している。すなわち、溶接装置301は、追加学習要否装置300およびコントローラ302を内蔵しており、この点が第五の実施形態にかかる溶接装置401と異なる。
FIG. 30 shows an outline of the additional learning necessity device 300 and the welding device 301 equipped with it. In FIG. 30, the welding device 301 incorporates the additional learning necessity device 300. The additional learning necessity device 300 has a built-in controller 302. That is, the welding device 301 incorporates the additional learning necessity device 300 and the controller 302, and this point differs from the welding device 401 according to the fifth embodiment.
追加学習要否装置300およびコントローラ302の機能は、第五の実施形態に示される追加学習要否装置300およびコントローラ302と同様である。
The functions of the additional learning necessity device 300 and the controller 302 are the same as those of the additional learning necessity device 300 and the controller 302 shown in the fifth embodiment.
(第七の実施形態)
本開示の第七の実施形態にかかる追加学習要否装置300およびそれを備えた成形装置501について説明する。 (Seventh Embodiment)
The additional learning necessity device 300 and the molding device 501 provided with it according to the seventh embodiment of the present disclosure will be described.
この追加学習要否装置300およびそれを備えた成形装置501の概略を図31に示す。成形装置501は、例えばプラスチック製品を射出成型する。すなわち、成形装置501は、溶接装置301が金属を用いて溶接する代わりに、プラスチック原料を用いて製品を製造する。すなわち、溶接の代わりに、例えばプラスチック製品を射出成型して製造する場合においても追加学習要否装置300を適用することができる。ここで、不良局所形状の位置情報データ170や良品データ190は、製造されたプラスチック製品より得ることができる。
FIG. 31 shows an outline of the additional learning necessity device 300 and the molding device 501 equipped with it. The molding device 501, for example, injection-molds a plastic product. That is, the molding device 501 manufactures a product using a plastic raw material, instead of the welding device 301 welding using metal. In other words, the additional learning necessity device 300 can also be applied to a case where, instead of welding, a plastic product is manufactured by injection molding, for example. Here, the position information data 170 of the defective local shape and the non-defective product data 190 can be obtained from the manufactured plastic product.
なお、第四、第六および第七の実施形態においては溶接やプラスチック製品の射出成型に対して追加学習要否装置300を適用したが、追加学習要否装置300はそれ以外にも適用可能であり、様々な材料よりなる製品の塗装や接着等に適用することができる。また、第五の実施形態においては溶接に対して追加学習要否装置400を適用したが、追加学習要否装置400はそれ以外にも適用可能であり、例えばプラスチック製品の射出成型や様々な材料よりなる製品の塗装や接着等に適用することができる。
In the fourth, sixth, and seventh embodiments, the additional learning necessity device 300 is applied to welding and to injection molding of plastic products, but the additional learning necessity device 300 is also applicable to other processes, such as painting or bonding of products made of various materials. Further, in the fifth embodiment, the additional learning necessity device 400 is applied to welding, but the additional learning necessity device 400 is also applicable to other processes, such as injection molding of plastic products and painting or bonding of products made of various materials.
また、上記第四~第七の実施形態において、教師物体データ160、260、不良局所形状の位置情報データ170、270、不良品データ110、良品データ190、290等のデータは、追加学習要否装置300、400にあるメモリに格納されていてもよい。また、上記第四~第七の実施形態において、教師物体データ160、260、不良局所形状の位置情報データ170、270、不良品データ110、良品データ190、290等は、外部のサーバに格納されていてもよい。また、上記第四~第七の実施形態において、上記データの演算や処理を、いわゆるクラウドコンピューティングにより行ってもよい。
Further, in the fourth to seventh embodiments, data such as the teacher object data 160, 260, the position information data 170, 270 of the defective local shape, the defective product data 110, and the non-defective product data 190, 290 may be stored in the memory in the additional learning necessity devices 300, 400. Further, in the fourth to seventh embodiments, the teacher object data 160, 260, the position information data 170, 270 of the defective local shape, the defective product data 110, the non-defective product data 190, 290, and the like may be stored in an external server. Further, in the fourth to seventh embodiments, the calculation and processing of the above data may be performed by so-called cloud computing.
(他の実施形態)
以上、本開示の実施の形態に係る学習データの生成方法について、実施の形態に基づいて説明したが、本開示は、これらの実施の形態に限定されるものではない。本開示の趣旨を逸脱しない限り、当業者が思いつく各種変形を各実施の形態に施したもの、又は、異なる実施の形態における構成要素を組み合わせて構築される形態も、一つ又は複数の態様の範囲内に含まれても良い。 (Other embodiments)
The method of generating learning data according to the embodiments of the present disclosure has been described above based on the embodiments, but the present disclosure is not limited to these embodiments. As long as they do not deviate from the gist of the present disclosure, forms obtained by applying various modifications conceivable by those skilled in the art to each embodiment, or forms constructed by combining components of different embodiments, may also be included within the scope of one or more aspects.
本開示によれば、教師物体を撮影したデータに対して、事前に構築したAIモデルで正しく推論結果が得られなかった場合に、本手法(重畳)によるデータ拡張をし、追加の学習(再学習)を行うことができる。
According to the present disclosure, when a correct inference result is not obtained by a pre-built AI model for data obtained by photographing a teacher object, the data can be expanded by the present method (superimposition) and additional learning (re-learning) can be performed.
また、本開示によれば、本実施形態によるデータ拡張で少数のデータを作成し、作成したデータに対する推論の精度が所定の値よりも低かった場合に、より多数の教師データを生成するデータ拡張を行って、追加の学習(再学習)を行うことができる。
Further, according to the present disclosure, a small number of data can be created by the data expansion according to the present embodiment, and when the inference accuracy for the created data is lower than a predetermined value, data expansion that generates a larger number of teacher data can be performed for additional learning (re-learning).
(実施の態様)
第1の実施の態様にかかるデータ生成システム(100、200)は、データ抽出部(130)と、データ拡張部(140)と、を備える。データ抽出部(130)は、不良局所形状を有する物体(120、220)のデータ(160、260)から不良局所形状に関する第1のデータ(180、280)を抽出する。具体的には、データ抽出部(130)は、不良局所形状を有する物体(120、220)のデータ(160、260)と不良局所形状の位置情報データ(170、270)とに基づいて、第1のデータ(180、280)を抽出する。データ拡張部(140、240)は、第1のデータ(180、280)と、不良局所形状を有さない物体の第2のデータ(190、290)とを組み合わせる。 (Implementation mode)
The data generation system (100, 200) according to the first embodiment includes a data extraction unit (130) and a data expansion unit (140). The data extraction unit (130) extracts first data (180, 280) regarding a defective local shape from data (160, 260) of an object (120, 220) having the defective local shape. Specifically, the data extraction unit (130) extracts the first data (180, 280) based on the data (160, 260) of the object (120, 220) having the defective local shape and position information data (170, 270) of the defective local shape. The data expansion unit (140, 240) combines the first data (180, 280) with second data (190, 290) of an object having no defective local shape.
第2の実施の態様にかかるデータ生成システム(100、200)は、第一の実施の態様にかかるデータ生成システム(100、200)において、物体(120、220)は、複数の不良局所形状を有する。
In the data generation system (100, 200) according to the second embodiment, in the data generation system (100, 200) according to the first embodiment, the object (120, 220) has a plurality of defective local shapes.
第3の実施の態様にかかるデータ生成システム(100、200)は、第一の実施の態様または第二の実施の態様にかかるデータ生成システム(100、200)において、物体(120、220)は、複数の素材で形成されている。
In the data generation system (100, 200) according to the third embodiment, in the data generation system (100, 200) according to the first or second embodiment, the object (120, 220) is formed of a plurality of materials.
第4の実施の態様にかかるデータ生成システム(100、200)は、第一から第三の実施の態様にかかるデータ生成システム(100、200)において、データ抽出部(130)は、物体(120、220)を撮影する撮影部(150、250)を備える。
In the data generation system (100, 200) according to the fourth embodiment, in the data generation systems (100, 200) according to the first to third embodiments, the data extraction unit (130) includes a photographing unit (150, 250) that photographs the object (120, 220).
第5の実施の態様にかかるデータ生成システム(100、200)は、第四の実施の態様にかかるデータ生成システム(100、200)において、撮影部(150、250)は、物体(120、220)を複数の撮影条件で撮影する。
In the data generation system (100, 200) according to the fifth embodiment, in the data generation system (100, 200) according to the fourth embodiment, the photographing unit (150, 250) photographs the object (120, 220) under a plurality of photographing conditions.
第6の実施の態様にかかるデータ生成システム(100、200)は、第一から第五の実施の態様にかかるデータ生成システム(100,200)において、第1のデータ(180、280)および第2のデータ(190、290)は、画像情報である。
In the data generation system (100, 200) according to the sixth embodiment, in the data generation systems (100, 200) according to the first to fifth embodiments, the first data (180, 280) and the second data (190, 290) are image information.
第7の実施の態様にかかるデータ生成システム(100、200)は、第一から第六の実施の態様にかかるデータ生成システム(100、200)において、第1のデータ(180、280)と第2のデータ(190、290)の組み合わせは、複数のパラメータに基づくデータ拡張によって行われる。
In the data generation system (100, 200) according to the seventh embodiment, in the data generation systems (100, 200) according to the first to sixth embodiments, the combination of the first data (180, 280) and the second data (190, 290) is performed by data expansion based on a plurality of parameters.
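As one possible reading of "data expansion based on a plurality of parameters" (an illustrative assumption, not the claimed implementation), combinations of two parameters such as quarter-turn rotation and horizontal flip can be enumerated to multiply a single first-data patch into many variants:

```python
import itertools
import numpy as np

def expand(patch, rotations=(0, 1, 2, 3), flips=(False, True)):
    """Enumerate combinations of two augmentation parameters
    (quarter-turn count, horizontal flip) for one patch."""
    variants = []
    for k, flip in itertools.product(rotations, flips):
        v = np.rot90(patch, k)
        if flip:
            v = np.fliplr(v)
        variants.append(v.copy())
    return variants

patch = np.array([[1.0, 2.0], [3.0, 4.0]])
variants = expand(patch)  # 4 rotations x 2 flips = 8 variants
```

Adding further parameters (paste position, scale, brightness, and the like) multiplies the number of generated samples in the same combinatorial way.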
第8の態様にかかるデータ生成方法は、不良局所形状を有する物体(120、220)のデータ(160、260)から不良局所形状に関する第1のデータ(180、280)を抽出するステップと、第1のデータ(180、280)と、不良局所形状を有さない物体の第2のデータ(190、290)とを組み合わせるステップと、有する。抽出するステップは、不良局所形状を有する物体(120、220)のデータ(160、260)と不良局所形状の位置情報データ(170、270)とに基づいて、第1のデータ(180,280)を抽出するステップを含む。
The data generation method according to the eighth aspect includes a step of extracting first data (180, 280) regarding a defective local shape from data (160, 260) of an object (120, 220) having the defective local shape, and a step of combining the first data (180, 280) with second data (190, 290) of an object having no defective local shape. The extracting step includes a step of extracting the first data (180, 280) based on the data (160, 260) of the object (120, 220) having the defective local shape and position information data (170, 270) of the defective local shape.
第9の態様にかかるデータ生成方法は、第8の態様にかかるデータ生成方法において、物体(120、220)は、複数の不良局所形状を有する。
In the data generation method according to the ninth aspect, in the data generation method according to the eighth aspect, the object (120, 220) has a plurality of defective local shapes.
第10の態様にかかるデータ生成方法は、第8または第9の態様にかかるデータ生成方法において、物体(120、220)は、複数の素材で形成されている。
In the data generation method according to the tenth aspect, in the data generation method according to the eighth or ninth aspect, the object (120, 220) is formed of a plurality of materials.
第11の態様にかかるデータ生成方法は、第8から第10の態様にかかるデータ生成方法において、データ抽出部(130)は、物体(120、220)を撮影する撮影部(150、250)を備える。
In the data generation method according to the eleventh aspect, in the data generation methods according to the eighth to tenth aspects, the data extraction unit (130) includes a photographing unit (150, 250) that photographs the object (120, 220).
第12の態様にかかるデータ生成方法は、第11の態様にかかるデータ生成方法において、撮影部(150、250)は、物体(120、220)を複数の撮影条件で撮影する。
In the data generation method according to the twelfth aspect, in the data generation method according to the eleventh aspect, the photographing unit (150, 250) photographs the object (120, 220) under a plurality of photographing conditions.
第13の態様にかかるデータ生成方法は、第8から第12の態様にかかるデータ生成方法において、第1のデータ(180、280)および第2のデータ(190、290)は、画像情報である。
In the data generation method according to the thirteenth aspect, in the data generation methods according to the eighth to twelfth aspects, the first data (180, 280) and the second data (190, 290) are image information.
第14の態様にかかるデータ生成方法は、第8から第13の態様にかかるデータ生成方法において、第1のデータ(180、280)と第2のデータ(190、290)の組み合わせは、複数のパラメータに基づくデータ拡張によって行われる。
In the data generation method according to the fourteenth aspect, in the data generation methods according to the eighth to thirteenth aspects, the combination of the first data (180, 280) and the second data (190, 290) is performed by data expansion based on a plurality of parameters.
第15の態様にかかるデータ生成方法は、物体(120、220)の局所的特徴に割り当てられたクラスを判定するための学習用データ(110、210)を生成する方法である。当該方法は、クラスの局所特徴を有するサンプル物体(120、220)と、物体(120、220)における局所特徴が含まれる位置と範囲を示す位置情報(170、270)とにより、サンプル物体(120、220)を撮影して得られた局所特徴(180、280)を用いて学習用データ(110、210)を生成する。
The data generation method according to the fifteenth aspect is a method of generating learning data (110, 210) for determining a class assigned to a local feature of an object (120, 220). In the method, the learning data (110, 210) is generated using a local feature (180, 280) obtained by photographing a sample object (120, 220) having the local feature of the class, based on position information (170, 270) indicating the position and range in the object (120, 220) where the local feature is included.
第16の実施態様にかかるデータ生成装置(101)は、抽出部(130)と、データ拡張部(140)と、を備える。抽出部(130)は、不良局所形状を有する物体(120)から第1データ(180)を抽出する。抽出部(130)は、生成部(161)を含む。生成部(161)は、物体(120)から不良局所形状に関する複数のデータを抽出する。生成部(161)は、この複数のデータを融合して第1データ(180)を生成する。データ拡張部(140)は、組み合わせ部(111)を含む。組み合わせ部(111)は、第1データ(180)と、不良局所形状を有さない物体の第2データ(190)とを組み合わせる。
The data generation device (101) according to the sixteenth embodiment includes an extraction unit (130) and a data expansion unit (140). The extraction unit (130) extracts the first data (180) from the object (120) having a defective local shape. The extraction unit (130) includes a generation unit (161). The generation unit (161) extracts a plurality of data regarding the defective local shape from the object (120). The generation unit (161) fuses the plurality of data to generate the first data (180). The data expansion unit (140) includes a combination unit (111). The combination unit (111) combines the first data (180) and the second data (190) of an object having no defective local shape.
第17の実施態様にかかるデータ生成装置(101)は、抽出部(130)と、データ拡張部(140)と、を備える。抽出部(130)は、不良局所形状を有する物体(120)から第1データ(180)を抽出する。抽出部(130)は、生成部(161)を含む。生成部(161)は、複数の撮影条件で撮影したデータ(160)を融合して第1データ(180)を生成する。データ拡張部(140)は、組み合わせ部(111)を含む。組み合わせ部(111)は、第1データ(180)と、不良局所形状を有さない物体の第2データ(190)とを組み合わせる。
The data generation device (101) according to the seventeenth embodiment includes an extraction unit (130) and a data expansion unit (140). The extraction unit (130) extracts the first data (180) from the object (120) having a defective local shape. The extraction unit (130) includes a generation unit (161). The generation unit (161) generates the first data (180) by fusing the data (160) taken under a plurality of shooting conditions. The data expansion unit (140) includes a combination unit (111). The combination unit (111) combines the first data (180) and the second data (190) of an object having no defective local shape.
第18の実施態様にかかる追加学習要否装置は、第16または第17に記載のデータ生成装置(101)によって生成したデータ(110)を事前に構築したAIモデルで推論を行い、正しく推論結果が得られなかった場合に追加学習を行う判定をする。
The additional learning necessity device according to the eighteenth embodiment performs inference on the data (110) generated by the data generation device (101) according to the sixteenth or seventeenth embodiment using a pre-built AI model, and determines that additional learning is to be performed when a correct inference result is not obtained.
第19の実施態様にかかる追加学習要否装置は、第18の実施態様にかかる追加学習装置において、少数のデータを作成し、作成したデータに対する推論の精度が所定の値よりも低い場合に、より多数の教師データを生成するデータ拡張を行って、追加学習を行う。
In the additional learning necessity device according to the nineteenth embodiment, in the additional learning device according to the eighteenth embodiment, a small number of data are created, and when the inference accuracy for the created data is lower than a predetermined value, data expansion that generates a larger number of teacher data is performed, and additional learning is performed.
第20の態様にかかる追加学習要否装置は、第16または第17にかかるデータ生成装置(110)を備え、かつ対象物のデータ(B1)を得るステップ(S1)と、第1データ(180)を得るステップ(S2)と、対象物のデータ(B1)と第1データ(180)とを比較し、良・不良の推論を行うステップ(S3)と、良・不良の推論について良好か不良かを判断するステップ(S4)と、良好か不良かの結果により、さらなる第1データ(180)の記憶および学習をするかどうかの判断をし、良好な結果の場合追加学習を中止するステップ(S5)と、結果が不良の場合、データ生成装置(110)に対し第1データを新たに生成する(S7)よう指示し(S6)、かつAIモデルの更新または新たなAIモデルの構築を行うステップ(S8)と、を有する。
The additional learning necessity device according to the twentieth aspect includes the data generation device (110) according to the sixteenth or seventeenth aspect, and has: a step (S1) of obtaining data (B1) of an object; a step (S2) of obtaining first data (180); a step (S3) of comparing the object data (B1) with the first data (180) and inferring good or defective; a step (S4) of judging whether the good/defective inference result is good or poor; a step (S5) of deciding, depending on whether the result is good or poor, whether to further store and learn the first data (180), and stopping the additional learning in the case of a good result; and a step (S8) of, when the result is poor, instructing (S6) the data generation device (110) to newly generate first data (S7), and updating the AI model or building a new AI model.
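The steps S1 to S8 above can be sketched as a control loop (all callables here are hypothetical stand-ins for the photographing, inference, data generation, and model-update components described in the text):

```python
def additional_learning_loop(get_object_data, get_first_data, infer,
                             is_result_good, regenerate_first_data,
                             update_model, max_rounds=5):
    """Sketch of steps S1-S8: stop when the good/defective judgment is
    satisfactory (S5); otherwise regenerate data and retrain (S6-S8)."""
    for _ in range(max_rounds):
        obj = get_object_data()        # S1: data of the object (e.g. bead B1)
        first = get_first_data()       # S2: first data
        result = infer(obj, first)     # S3: good/defective inference
        if is_result_good(result):     # S4: judge the inference result
            return "stopped"           # S5: additional learning not needed
        regenerate_first_data()        # S6/S7: instruct new first data
        update_model()                 # S8: update or rebuild the AI model
    return "max rounds reached"

# Demo with stand-in callables: the judgment fails once, then succeeds.
state = {"rounds": 0, "updates": 0}

def is_good(_result):
    state["rounds"] += 1
    return state["rounds"] >= 2

status = additional_learning_loop(
    get_object_data=lambda: "scan of bead B1",
    get_first_data=lambda: "defective local shape data",
    infer=lambda obj, first: "judgment",
    is_result_good=is_good,
    regenerate_first_data=lambda: None,
    update_model=lambda: state.__setitem__("updates", state["updates"] + 1),
)
```

The `max_rounds` bound is an added safeguard, not part of the aspect; the disclosed device simply repeats regeneration and retraining until the judgment becomes good.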
本開示のデータ生成システム、データ生成方法、データ生成装置および追加学習要否装置は、例えば、溶接が正しく行われたか否かを検査する溶接外観検査などに適用でき、産業上有用である。
The data generation system, data generation method, data generation device, and additional learning necessity device of the present disclosure can be applied to, for example, a welding appearance inspection for inspecting whether or not welding is performed correctly, and are industrially useful.
100, 200 Generation system
101, 201 Data generation device
110 Defective product data
111, 211 Data combining device
112, 212, 163, 263 Arithmetic device
113, 213, 162, 262 Memory
120, 220 Teacher object
130, 230 Data extraction unit
140, 240 Data augmentation unit
150, 250 Imaging unit
160, 260 Teacher object data
170, 270 Position information data of defective local shape
171, 271 First data storage unit
180 Defective local shape data
181, 281 Second data storage unit
190, 210, 290 Non-defective product data
191, 291 Third data storage unit
280 Non-defective local shape data
300, 400 Additional learning necessity determination device
301, 401 Welding device
302, 402 Controller
501 Molding device
B1, B1a, B1b Bead
B11 First plate
B12, B12a, B12b Second plate
B2 Undercut
B3 Pit
B4 Spatter
C1 Protrusion
D1 Pit
D2, D2A, D2B Recess
E1 Hole
F2, F2A, F2B Virtual image
R1, R1A, R1B, R1C, R1D, R1E, R1F Region
Claims (20)
- A data generation system comprising: a data extraction unit that extracts first data relating to a defective local shape from data of an object having the defective local shape; and a data augmentation unit that combines the first data with second data of an object not having the defective local shape, wherein the data extraction unit extracts the first data based on the data of the object having the defective local shape and position information data of the defective local shape.
- The data generation system according to claim 1, wherein the object has a plurality of the defective local shapes.
- The data generation system according to claim 1 or 2, wherein the object is formed of a plurality of materials.
- The data generation system according to any one of claims 1 to 3, wherein the data extraction unit includes an imaging unit that photographs the object.
- The data generation system according to claim 4, wherein the imaging unit photographs the object under a plurality of imaging conditions.
- The data generation system according to any one of claims 1 to 5, wherein the first data and the second data are image information.
- The data generation system according to any one of claims 1 to 6, wherein the combination of the first data and the second data is performed by data augmentation based on a plurality of parameters.
- A data generation method comprising: a step of extracting first data relating to a defective local shape from data of an object having the defective local shape; and a step of combining the first data with second data of an object not having the defective local shape, wherein the extracting step includes a step of extracting the first data based on the data of the object having the defective local shape and position information data of the defective local shape.
- The data generation method according to claim 8, wherein the object has a plurality of the defective local shapes.
- The data generation method according to claim 8 or 9, wherein the object is formed of a plurality of materials.
- The data generation method according to any one of claims 8 to 10, wherein the data extraction unit includes an imaging unit that photographs the object.
- The data generation method according to claim 11, wherein the imaging unit photographs the object under a plurality of imaging conditions.
- The data generation method according to any one of claims 8 to 12, wherein the first data and the second data are image information.
- The data generation method according to any one of claims 8 to 13, wherein the combination of the first data and the second data is performed by data augmentation based on a plurality of parameters.
- A data generation method for generating training data used to determine a class assigned to a local feature of an object, wherein the training data are generated using the local feature obtained by photographing a sample object, based on the sample object having the local feature of the class and position information indicating the position and range of the object in which the local feature is included.
- A data generation device comprising: an extraction unit that extracts first data from an object having a defective local shape; and a data augmentation unit that combines the first data with second data of an object not having the defective local shape, wherein the extraction unit includes a generation unit that extracts a plurality of pieces of data relating to the defective local shape from the object and fuses the plurality of pieces of data to generate the first data, and the data augmentation unit includes a combining unit that combines the first data with the second data of the object not having the defective local shape.
- A data generation device comprising: an extraction unit that extracts first data from an object having a defective local shape; and a data augmentation unit that combines the first data with second data of an object not having the defective local shape, wherein the extraction unit includes a generation unit that fuses data photographed under a plurality of imaging conditions to generate the first data, and the data augmentation unit includes a combining unit that combines the first data with the second data of the object not having the defective local shape.
- An additional learning necessity determination device that performs inference on data generated by the data generation device according to claim 16 or 17, and determines that additional learning is to be performed when a correct inference result is not obtained.
- The additional learning necessity determination device according to claim 18, wherein a small amount of data is created and, when the accuracy of inference on the created data is lower than a predetermined value, data augmentation that generates a larger amount of teacher data is performed and additional learning is carried out.
- An additional learning device comprising the data generation device according to claim 16 or 17, and having: a step of obtaining data of a target object; a step of obtaining the first data; a step of comparing the data of the target object with the first data and performing good/defective inference; a step of judging whether the good/defective inference result is good or defective; a step of determining, based on the good or defective result, whether to further store and learn the first data, and stopping additional learning in the case of a good result; and a step of, when the result is defective, instructing the data generation device to newly generate the first data, and updating the AI model or constructing a new AI model.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020055241 | 2020-03-26 | ||
JP2020-055241 | 2020-03-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021193347A1 true WO2021193347A1 (en) | 2021-09-30 |
Family
ID=77892552
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/011062 WO2021193347A1 (en) | 2020-03-26 | 2021-03-18 | Data generation system, data generation method, data generation device, and additional learning requirement assessment device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021193347A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2023190045A1 (en) * | 2022-03-29 | 2023-10-05 | ||
WO2023190044A1 (en) * | 2022-03-29 | 2023-10-05 | パナソニックIpマネジメント株式会社 | Image generation system, image generation method, and program |
JPWO2023190046A1 (en) * | 2022-03-29 | 2023-10-05 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005156334A (en) * | 2003-11-25 | 2005-06-16 | Nec Tohoku Sangyo System Kk | Pseudo defective image automatic creation device and imaging inspection device |
JP2011214903A (en) * | 2010-03-31 | 2011-10-27 | Denso It Laboratory Inc | Appearance inspection apparatus, and apparatus, method and program for generating appearance inspection discriminator |
US20160328837A1 (en) * | 2015-05-08 | 2016-11-10 | Kla-Tencor Corporation | Method and System for Defect Classification |
JP2019109563A (en) * | 2017-12-15 | 2019-07-04 | オムロン株式会社 | Data generation device, data generation method, and data generation program |
-
2021
- 2021-03-18 WO PCT/JP2021/011062 patent/WO2021193347A1/en active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005156334A (en) * | 2003-11-25 | 2005-06-16 | Nec Tohoku Sangyo System Kk | Pseudo defective image automatic creation device and imaging inspection device |
JP2011214903A (en) * | 2010-03-31 | 2011-10-27 | Denso It Laboratory Inc | Appearance inspection apparatus, and apparatus, method and program for generating appearance inspection discriminator |
US20160328837A1 (en) * | 2015-05-08 | 2016-11-10 | Kla-Tencor Corporation | Method and System for Defect Classification |
JP2019109563A (en) * | 2017-12-15 | 2019-07-04 | オムロン株式会社 | Data generation device, data generation method, and data generation program |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2023190045A1 (en) * | 2022-03-29 | 2023-10-05 | ||
WO2023190044A1 (en) * | 2022-03-29 | 2023-10-05 | パナソニックIpマネジメント株式会社 | Image generation system, image generation method, and program |
WO2023190045A1 (en) * | 2022-03-29 | 2023-10-05 | パナソニックIpマネジメント株式会社 | Image generation system, image generation method, and program |
JPWO2023190046A1 (en) * | 2022-03-29 | 2023-10-05 | ||
WO2023190046A1 (en) * | 2022-03-29 | 2023-10-05 | パナソニックIpマネジメント株式会社 | Data creation system, data creation method, and program |
JPWO2023190044A1 (en) * | 2022-03-29 | 2023-10-05 | ||
JP7702659B2 (en) | 2022-03-29 | 2025-07-04 | パナソニックIpマネジメント株式会社 | Image generation system, image generation method, and program |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021193347A1 (en) | Data generation system, data generation method, data generation device, and additional learning requirement assessment device | |
Siu et al. | A framework for synthetic image generation and augmentation for improving automatic sewer pipe defect detection | |
Loverdos et al. | Automatic image-based brick segmentation and crack detection of masonry walls using machine learning | |
Huynh | Vision-based autonomous bolt-looseness detection method for splice connections: Design, lab-scale evaluation, and field application | |
US11915408B2 (en) | Methods of artificial intelligence-assisted infrastructure assessment using mixed reality systems | |
Gobert et al. | Application of supervised machine learning for defect detection during metallic powder bed fusion additive manufacturing using high resolution imaging. | |
US20210201472A1 (en) | Method of inspecting and evaluating coating state of steel structure and system therefor | |
US10262236B2 (en) | Neural network training image generation system | |
JP7004145B2 (en) | Defect inspection equipment, defect inspection methods, and their programs | |
TW202240546A (en) | Image augmentation techniques for automated visual inspection | |
JP2021131853A (en) | Change detection method and system using AR overlay | |
Karaaslan et al. | Mixed reality-assisted smart bridge inspection for future smart cities | |
CN111523411B (en) | Synthetic aperture imaging method based on semantic patching | |
Tatzel et al. | Image-based modelling and visualisation of the relationship between laser-cut edge and process parameters | |
Patel et al. | Using machine learning to analyze image data from advanced manufacturing processes | |
Baek et al. | Data augmentation using adversarial training for construction-equipment classification | |
Pham et al. | Automatic welding seam tracking and real-world coordinates identification with machine learning method | |
Nielsen et al. | Automatic melt pool recognition in X-ray radiography images from laser-molten Al alloy | |
CN112651341A (en) | Processing method of welded pipe weld joint real-time detection video | |
DE102018218611B4 (en) | Method and computing device for generating a three-dimensional CAD model | |
CN115841546A (en) | Scene structure associated subway station multi-view vector simulation rendering method and system | |
JP7702659B2 (en) | Image generation system, image generation method, and program | |
CN116894808A (en) | Teacher data generation method and generation device | |
Gobert | Online discontinuity detection in metallic powder bed fusion additive manufacturing processes using visual inspection sensors and supervised machine learning | |
Jani | Deep Learning-Based Spatter Detection and Weld Pool Segmentation for Automated Welding Quality Assessment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21775295 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21775295 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |