
CN112833806A - Needle inspection device - Google Patents


Info

Publication number
CN112833806A
CN112833806A (Application number CN202011299708.3A)
Authority
CN
China
Prior art keywords
needle
image
imaging device
inspection
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011299708.3A
Other languages
Chinese (zh)
Other versions
CN112833806B (en)
Inventor
藤江公子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Juki Corp
Original Assignee
Juki Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Juki Corp filed Critical Juki Corp
Publication of CN112833806A
Application granted
Publication of CN112833806B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/16 Measuring arrangements characterised by the use of optical techniques for measuring the deformation in a solid, e.g. optical strain gauge
    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B81/00 Sewing machines incorporating devices serving purposes other than sewing, e.g. for blowing air, for grinding
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Textile Engineering (AREA)
  • Sewing Machines And Sewing (AREA)

Abstract

Technical problem: To inspect a sewing needle with high accuracy. Solution: A needle inspection device includes: a first imaging device that images a sewing needle held by a needle bar of a sewing machine; a second imaging device that images the sewing needle; and a processing device that outputs inspection data of the sewing needle based on a first image obtained by the first imaging device and a second image obtained by the second imaging device. The first imaging device and the second imaging device are arranged such that a first optical axis, representing the optical axis of the first imaging device, and a second optical axis, representing the optical axis of the second imaging device, are orthogonal on the sewing needle.


Description

Sewing needle inspection device
Technical Field
The present disclosure relates to a needle inspection device.
Background
In the technical field of sewing machines, a sewing machine having an imaging unit as disclosed in patent document 1 is known.
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open Publication No. 2009-189551
Disclosure of Invention
(I) Technical problem to be solved
There are cases where images of a needle are obtained by a plurality of imaging devices and the needle is inspected based on the plurality of images. Depending on the relative positions of the plurality of imaging devices, it may be difficult to inspect the needle with high accuracy.
The purpose of the present disclosure is to inspect a sewing needle with high accuracy.
(II) Technical solution
According to the present disclosure, there is provided a needle inspection device including: a first imaging device for imaging a needle held by a needle bar of a sewing machine; a second imaging device that images the needle; and a processing device that outputs inspection data of the needle based on a first image obtained by the first imaging device and a second image obtained by the second imaging device, the first imaging device and the second imaging device being arranged such that a first optical axis representing an optical axis of the first imaging device and a second optical axis representing an optical axis of the second imaging device are orthogonal on the needle.
(III) Advantageous effects
According to the present disclosure, a sewing needle can be inspected with high accuracy.
Drawings
Fig. 1 is a perspective view schematically showing a sewing machine according to an embodiment.
Fig. 2 is a plan view schematically showing the needle inspection device according to the embodiment.
Fig. 3 is a diagram for explaining a camera coordinate system of the embodiment.
Fig. 4 is a block diagram showing a needle inspection device according to an embodiment.
Fig. 5 is a flowchart showing a needle inspection method according to an embodiment.
Fig. 6 is a diagram showing a calibration jig according to an embodiment.
Fig. 7 is a diagram showing a state of use of the calibration jig according to the embodiment.
Fig. 8 is a diagram showing an example of a first image of the calibration jig obtained by the first imaging device of the embodiment.
Fig. 9 is a diagram showing an example of a second image of the calibration jig obtained by the second imaging device of the embodiment.
Fig. 10 is a diagram for explaining a reference axis measuring tool and a method of calculating a needle reference axis according to the embodiment.
Fig. 11 is a flowchart showing a calibration process according to the embodiment.
Fig. 12 is a flowchart showing focus adjustment processing according to the embodiment.
Fig. 13 is a diagram for explaining a method of calculating the edge sharpness of the grid according to the embodiment.
Fig. 14 is a flowchart showing a pixel rate calculation process according to the embodiment.
Fig. 15 is a diagram for explaining the pixel rate calculation processing according to the embodiment.
Fig. 16 is a diagram for explaining the relative camera angle calculation processing according to the embodiment.
Fig. 17 is a diagram showing an example of first image data of the calibration jig obtained by the first imaging device in the camera angle calculation process according to the embodiment.
Fig. 18 is a diagram for explaining the distance Vsz and the distance Ksz according to the embodiment.
Fig. 19 is a flowchart showing a needle inspection process according to the embodiment.
Fig. 20 is a flowchart showing a needle registration process according to the embodiment.
Fig. 21 is a diagram for explaining reference feature values of the sewing needle according to the embodiment.
Fig. 22 is a flowchart showing a needle inspection process according to the embodiment.
Fig. 23 is a diagram for explaining detection characteristics of the sewing needle according to the embodiment.
Fig. 24 is a diagram showing an example of a first image of the needle according to the embodiment.
Fig. 25 is a diagram illustrating an example of a method of processing a first image according to the embodiment.
Fig. 26 is a diagram illustrating an example of a method of processing a first image according to the embodiment.
Fig. 27 is a diagram showing an example of a method of calculating the deformation amount of the needle subjected to bending deformation according to the embodiment.
Description of the reference numerals
1-sewing machine; 2-sewing machine head; 3-sewing needle; 3R-reference needle; 3S-detection needle; 4-needle bar; 5-needle plate; 6-presser foot member; 7-needle hole; 10-needle inspection device; 11-first imaging device; 12-second imaging device; 13-processing device; 13A-image obtaining unit; 13B-calibration unit; 13C-calibration storage unit; 13D-reference feature amount calculation unit; 13E-reference feature amount storage unit; 13F-detection feature amount calculation unit; 13G-inspection unit; 14-output device; 20-calibration jig; 21-plate; 22-grid; 22A-uppermost grid line; 22B-lowermost grid line; 23-determination range; 24-simulated grid line; 30-reference axis measuring tool; AX1-first optical axis; AX2-second optical axis; AR-minimum width; AS-minimum width; BR-maximum width; BS-maximum width; CR-needle point angle; CS-needle point angle; DS-deformation amount; EA-discontinuous edge section; FA-field of view; ID1-first image; ID2-second image; Ksz-distance; LR-length; LS-length; Lv-virtual vertical field of view; NP-needle position; NX-needle reference axis; NXo-center; P1-intersection; P2-intersection; Pl-point; Pr-point; PS-sewing position; Rv-virtual vertical field of view; S-sewing object; SE-stitch; Vsz-distance; θ-angle.
Detailed Description
Embodiments of the present disclosure will be described below with reference to the drawings, but the present disclosure is not limited thereto. The components of the embodiments described below can be combined as appropriate. In addition, some of the components may not be used.
In the embodiment, the positional relationship of each part will be described based on a sewing machine coordinate system defined in the sewing machine 1. The sewing machine coordinate system is a three-dimensional orthogonal coordinate system defined by an Xm axis, a Ym axis, and a Zm axis. The Xm axis is defined in a predetermined plane. The Ym axis is defined as being orthogonal to the Xm axis in the predetermined plane. The Zm axis is defined as being orthogonal to the predetermined plane. In the embodiment, the predetermined plane is parallel to the horizontal plane, and the direction parallel to the Zm axis is the vertical direction. In the embodiment, the predetermined plane is appropriately referred to as the XmYm plane.
[ Sewing machine ]
Fig. 1 is a perspective view schematically showing the sewing machine 1 according to the embodiment. As shown in fig. 1, the sewing machine 1 includes: a sewing machine head 2, a needle bar 4 that holds a sewing needle 3 and reciprocates in the Zm axis direction, a needle plate 5 that supports a sewing object S, a presser foot member 6 that presses the sewing object S, and a sewing needle inspection device 10 that inspects the sewing needle 3.
The sewing head 2 supports the needle bar 4 so as to be capable of reciprocating in the Zm axis direction. The needle bar 4 is disposed above the needle plate 5 and can be opposed to the surface of the sewing object S. The needle bar 4 holds the upper end of the needle 3 so that the needle 3 extends in the Zm axis direction. An upper thread is hung on the sewing needle 3.
The needle plate 5 supports the back surface of the sewing object S. The upper surface of the needle plate 5 is parallel to the XmYm plane. The needle plate 5 supports the sewing object S from below. The needle plate 5 has a needle hole 7 through which the sewing needle 3 can pass. A hook (not shown) is disposed below the needle plate 5. The hook holds a bobbin accommodated in a bobbin case. A lower thread is wound on the bobbin. The hook rotates in synchronization with the reciprocating movement of the needle bar 4. The lower thread is fed out from the hook.
The presser foot member 6 presses the sewing object S from above. The presser foot member 6 is supported by the sewing machine head 2. The presser foot member 6 is disposed above the needle plate 5 and contacts the surface of the sewing object S. The presser foot member 6 holds the sewing object S between the presser foot member and the needle plate 5.
When the needle bar 4 is lowered, the sewing needle 3 held by the needle bar 4 penetrates the sewing object S and passes through the needle hole 7 provided in the needle plate 5. When the sewing needle 3 passes through the needle hole 7 of the needle plate 5, the lower thread supplied from the hook is caught on the upper thread carried by the sewing needle 3. With the lower thread caught on the upper thread, the sewing needle 3 is lifted and retreats from the sewing object S. While the sewing needle 3 penetrates the sewing object S, the sewing object S is stopped. When the sewing needle 3 retreats from the sewing object S, the sewing object S is moved in the +Ym direction by the feed mechanism of the sewing machine 1. The sewing machine 1 reciprocates the sewing needle 3 while repeating the movement and stopping of the sewing object S in the +Ym direction, thereby forming a stitch SE on the sewing object S. The stitch SE formed on the sewing object S extends in the Ym axis direction.
In the following description, a position directly below the sewing needle 3 is appropriately referred to as a sewing position PS. In the XmYm plane, the sewing position PS coincides with the position of the sewing needle 3. At the sewing position PS, the sewing needle 3 penetrates the sewing object S.
[ Sewing needle inspection device ]
Fig. 2 is a plan view schematically showing the needle inspection device 10 according to the embodiment. As shown in fig. 1 and 2, the needle inspection device 10 includes: a first imaging device 11, a second imaging device 12, a processing device 13, and an output device 14.
The needle 3 is held by the needle bar 4 so as to extend in the Zm axis direction. The first imaging device 11 and the second imaging device 12 respectively image the sewing needle 3 held by the needle bar 4. The first imaging device 11 and the second imaging device 12 image the needle 3 in a state where the needle bar 4 and the needle 3 are stationary, respectively.
The first imaging device 11 and the second imaging device 12 can image the needle 3 simultaneously. The first imaging device 11 may also image the needle 3 before the second imaging device 12 images the needle 3, or after the second imaging device 12 images the needle 3.
The first imaging device 11 images the needle 3 and obtains a first image ID1 of the needle 3. The first imaging device 11 outputs the first image ID1 of the needle 3 to the processing device 13.
The second imaging device 12 images the needle 3 and obtains a second image ID2 of the needle 3. The second imaging device 12 outputs the second image ID2 of the needle 3 to the processing device 13.
The first imaging device 11 has an optical system and an image sensor that receives light via the optical system. Examples of the image sensor include a CCD (Charge Coupled Device) image sensor and a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
The second imaging device 12 has an optical system and an image sensor that receives light via the optical system. Examples of the image sensor likewise include a CCD image sensor and a CMOS image sensor.
In the following description, the optical axis of the optical system of the first imaging device 11 is appropriately referred to as a first optical axis AX1, and the optical axis of the optical system of the second imaging device 12 is appropriately referred to as a second optical axis AX2. The first optical axis AX1 represents the optical axis of the first imaging device 11. The second optical axis AX2 represents the optical axis of the second imaging device 12.
The first imaging device 11 and the second imaging device 12 are disposed at different positions on the XmYm plane. The first imaging device 11 and the second imaging device 12 are disposed such that the first optical axis AX1 is orthogonal to the second optical axis AX2 on the needle 3. The first optical axis AX1 intersects at least a portion of the needle 3. The second optical axis AX2 intersects at least a portion of the needle 3.
The first image pickup device 11 is disposed such that the first optical axis AX1 is parallel to the XmYm plane. The second image pickup device 12 is disposed such that the second optical axis AX2 is parallel to the XmYm plane. The first optical axis AX1 and the second optical axis AX2 are parallel to the XmYm plane, respectively.
In the Zm-axis direction, the position of the first optical axis AX1 substantially coincides with the position of the second optical axis AX2. In the Zm-axis direction, the position of the first optical axis AX1, the position of the second optical axis AX2, and the position of at least a part of the needle 3 held by the needle bar 4 coincide with each other.
That is, the first imaging device 11 and the second imaging device 12 each image the needle 3 from the side in a state of being arranged at substantially the same height as the needle 3. The first imaging device 11 and the second imaging device 12 capture images of the needle 3 from different directions in the XmYm plane.
The processing device 13 comprises a computer system. The processing device 13 outputs the inspection data of the needle 3 to the output device 14 based on the first image ID1 of the needle 3 obtained by the first imaging device 11 and the second image ID2 of the needle 3 obtained by the second imaging device 12.
The output device 14 outputs the inspection data of the sewing needle 3 supplied from the processing device 13. Examples of the output device 14 include a display device such as a flat panel display and a printing device such as an inkjet printer. In the embodiment, it is assumed that the output device 14 is a display device.
[ Camera coordinate system ]
Fig. 3 is a diagram for explaining a camera coordinate system of the embodiment. The structure and optical characteristics of the first imaging device 11 are the same as those of the second imaging device 12. A camera coordinate system is defined in each of the first imaging device 11 and the second imaging device 12. The camera coordinate system is defined by a three-dimensional orthogonal coordinate system. The camera coordinate system is defined by the Xc axis, Yc axis, and Zc axis. The Xc axis is defined in a predetermined plane parallel to an imaging plane (incidence plane) of the image sensor. The Yc axis is defined as being orthogonal to the Xc axis in a predetermined plane. The Zc axis is defined to be orthogonal to the predetermined plane.
In the embodiment, the Xc axis is parallel to the Zm axis. The Yc axis is parallel to the XmYm plane. The field of view FA of the optical system of each of the first imaging device 11 and the second imaging device 12 is rectangular: 10.0 [mm] in the Xc axis direction and 7.5 [mm] in the Yc axis direction. Each of the image sensors of the first imaging device 11 and the second imaging device 12 has 640 pixels.
[ Processing device ]
Fig. 4 is a block diagram showing the needle inspection device 10 according to the embodiment. As shown in fig. 4, the needle inspection device 10 includes: a first imaging device 11, a second imaging device 12, a processing device 13, and an output device 14.
The processing device 13 performs image processing on the first image ID1 of the needle 3 obtained by the first imaging device 11 and the second image ID2 of the needle 3 obtained by the second imaging device 12, and outputs inspection data indicating the inspection result of the needle 3. The inspection data includes the presence or absence of an abnormality of the needle 3.
The abnormality of the needle 3 includes bending deformation of the needle 3. In the embodiment, the processing device 13 performs image processing on the first image ID1 and the second image ID2, and determines whether or not the needle 3 is subjected to bending deformation. The processing device 13 also performs image processing on the first image ID1 and the second image ID2, and calculates a deformation amount DS of the needle 3 subjected to bending deformation. The inspection data includes the deformation amount DS of the needle 3 subjected to bending deformation.
As shown in fig. 4, the processing device 13 includes: an image obtaining unit 13A, a calibration unit 13B, a calibration storage unit 13C, a reference feature amount calculation unit 13D, a reference feature amount storage unit 13E, a detection feature amount calculation unit 13F, and an inspection unit 13G.
The image obtaining unit 13A obtains the first image ID1 of the needle 3 from the first imaging device 11. The image obtaining unit 13A obtains the second image ID2 of the needle 3 from the second imaging device 12.
The calibration unit 13B performs calculation processing for calibrating the first imaging device 11 and the second imaging device 12, respectively, in the calibration processing SA for the first imaging device 11 and the second imaging device 12.
The calibration storage unit 13C stores data relating to calibration obtained in the calibration process SA.
The reference feature amount calculation unit 13D calculates a first reference feature amount based on the first image ID1 of the needle 3 held by the needle bar 4. The reference feature amount calculation unit 13D calculates a second reference feature amount based on the second image ID2 of the needle 3 held by the needle bar 4.
The reference feature storage unit 13E stores the first reference feature and the second reference feature of the sewing needle 3. In the embodiment, the reference feature storage unit 13E stores the first reference feature and the second reference feature calculated by the reference feature calculation unit 13D.
The detection feature amount calculation unit 13F performs image processing on the first image ID1 of the needle 3 obtained by the image obtaining unit 13A to calculate a first detection feature amount of the needle 3. The detection feature amount calculation unit 13F performs image processing on the second image ID2 of the needle 3 obtained by the image obtaining unit 13A to calculate a second detection feature amount of the needle 3.
In the embodiment, the needle 3 includes a reference needle 3R and a detection needle 3S as an inspection target.
The reference needle 3R is a normal needle 3. That is, the reference needle 3R is the needle 3 without abnormality. As a reference needle 3R, a new needle 3 is exemplified.
The detection needle 3S is the needle 3 used for the sewing process. The detection needle 3S is a needle 3 which may have an abnormality. As the detection needle 3S, a reference needle 3R after use is exemplified.
In the embodiment, the reference feature amount calculation unit 13D calculates the first reference feature amount of the reference needle 3R based on the first image ID1 of the reference needle 3R held by the needle bar 4. The reference feature amount calculation unit 13D calculates a second reference feature amount of the reference stitch 3R based on the second image ID2 of the reference stitch 3R.
The detection feature amount calculation unit 13F performs image processing on the first image ID1 of the detection needle 3S held by the needle bar 4 to calculate a first detection feature amount of the detection needle 3S. The detection feature amount calculation unit 13F performs image processing on the second image ID2 of the detection needle 3S held by the needle bar 4 to calculate a second detection feature amount of the detection needle 3S.
The inspection unit 13G compares the first detection feature amount with the first reference feature amount, and compares the second detection feature amount with the second reference feature amount, and outputs inspection data indicating an inspection result of the needle 3 (the detected needle 3S).
[ Sewing needle inspection method ]
Fig. 5 is a flowchart showing a needle inspection method according to the embodiment. As shown in fig. 5, the needle inspection method includes a calibration process SA and a needle inspection process SB.
< calibration processing >
The calibration process SA will be explained. The calibration process SA is a process of calibrating the first imaging device 11 and the second imaging device 12, respectively.
In the calibration process SA, the calibration jig 20 and the reference axis measuring tool 30 are used.
(Calibration jig)
Fig. 6 is a diagram showing an example of the calibration jig 20 according to the embodiment. As shown in fig. 6, the calibration jig 20 includes a plate 21 and a grid 22 pattern formed on the plate 21. The plate 21 is made of glass and is a parallel flat plate; that is, the front and back surfaces of the plate 21 are flat and parallel to each other. The plate 21 is square, with a side length of 30 [mm] and a thickness of 3 [mm]. The surface of the plate 21 is white, and the grid 22 is black; the black grid 22 is provided on the surface of the white plate 21. The grid 22 spacing is 1 [mm], and the line width of the grid 22 is 100 [μm].
Fig. 7 is a diagram showing a state of use of the calibration jig 20 according to the embodiment. A needle position NP is specified in the sewing machine 1. The needle position NP is the position at which the sewing needle 3 is disposed in the XmYm plane, and indicates the target position of the sewing needle 3. The needle position NP is the center of the needle bar 4 in the XmYm plane. The needle position NP can be specified in the sewing machine coordinate system based on, for example, design data of the sewing machine 1.
In the calibration process SA, the calibration jig 20 is set in the sewing machine 1 such that the surface of the calibration jig 20 is orthogonal to the XmYm plane. The calibration jig 20 is set in the sewing machine 1 such that the surface of the calibration jig 20 coincides with the needle position NP. The first imaging device 11 is disposed such that the first optical axis AX1 intersects the surface of the calibration jig 20 at an angle of 45 [°]. The second imaging device 12 is disposed such that the second optical axis AX2 intersects the surface of the calibration jig 20 at an angle of 45 [°]. The first imaging device 11 and the second imaging device 12 are arranged such that the first optical axis AX1 intersects the second optical axis AX2 at an angle of 90 [°]. The first imaging device 11 and the second imaging device 12 each image the calibration jig 20.
Fig. 8 is a diagram showing an example of the first image ID1 of the calibration jig 20 obtained by the first imaging device 11 according to the embodiment. Fig. 9 is a diagram showing an example of the second image ID2 of the calibration jig 20 obtained by the second imaging device 12 according to the embodiment.
(Reference axis measuring tool)
Fig. 10 is a diagram for explaining the reference axis measuring tool 30 and a method of calculating the needle reference axis NX according to the embodiment. As shown in fig. 10, the reference axis measuring tool 30 is held by the needle bar 4. The reference axis measuring tool 30 is, for example, a straight metal rod. The first imaging device 11 and the second imaging device 12 each image the reference axis measuring tool 30 held by the needle bar 4.
The first image ID1 of the reference axis measuring tool 30 obtained by the first imaging device 11 and the second image ID2 of the reference axis measuring tool 30 obtained by the second imaging device 12 are obtained by the image obtaining section 13A. The calibration unit 13B calculates the needle reference axis NX based on the first image ID1 of the reference axis measuring tool 30 and the second image ID2 of the reference axis measuring tool 30.
The needle reference axis NX is an axis connecting the center of the needle bar 4 and the center of the needle hole 7. The needle reference axis NX in the first image ID1 and the needle reference axis NX in the second image ID2 are calculated by obtaining the first image ID1 and the second image ID2 of the reference axis measuring tool 30, respectively.
The position of the needle reference axis NX in the first image ID1 and the position of the needle reference axis NX in the second image ID2 are stored in the calibration storage unit 13C, respectively. The position of the needle reference axis NX in the first image ID1 and the position of the needle reference axis NX in the second image ID2 are the positions of the needle reference axis NX in the camera coordinate system, respectively.
Fig. 10 shows an example of the first image ID1 of the reference axis measuring tool 30 obtained by the first imaging device 11 and an example of the second image ID2 of the reference axis measuring tool 30 obtained by the second imaging device 12.
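A minimal sketch of how the position of the needle reference axis NX might be extracted from an image of the straight reference axis measuring tool 30 is given below, assuming OpenCV and NumPy as the toolset; the thresholding choice and the row-wise centroid fit are illustrative assumptions, not the patent's prescribed procedure. The fitted line from each camera's image is what would be stored in the calibration storage unit 13C as the needle reference axis position in that camera's coordinate system.

```python
import cv2
import numpy as np

def estimate_reference_axis(image_gray: np.ndarray) -> tuple:
    """Fit a straight centerline to the dark rod in a grayscale image.

    Returns (slope, intercept) of the rod centerline in pixel coordinates,
    with the image row index as the independent variable.
    """
    # Separate the dark rod from the bright background (Otsu threshold is
    # an assumed choice, not specified by the patent).
    _, mask = cv2.threshold(image_gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    rows, centers = [], []
    for r in range(mask.shape[0]):
        cols = np.flatnonzero(mask[r])
        if cols.size:                    # rod visible in this row
            rows.append(r)
            centers.append(cols.mean())  # horizontal center of the rod
    # Least-squares line through the per-row centers: col = slope*row + intercept
    slope, intercept = np.polyfit(rows, centers, 1)
    return float(slope), float(intercept)
```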
Fig. 11 is a flowchart showing calibration processing SA according to the embodiment. As shown in fig. 11, the calibration process SA includes a focus adjustment process SA1, a pixel rate calculation process SA2, and a relative camera angle calculation process SA 3.
(Focus adjustment)
The focus adjustment processing SA1 of each of the first image pickup device 11 and the second image pickup device 12 is performed. As described with reference to fig. 7, in the focus adjustment process SA1, the calibration jig 20 is set on the sewing machine 1.
Fig. 12 is a flowchart showing focus adjustment processing SA1 according to the embodiment. The focus adjustment processing SA1 of the first imaging device 11 will be described below.
When the focus adjustment of the first imaging device 11 is performed, the calibration unit 13B sets the focus of the optical system of the first imaging device 11 (step SA 11).
After the focus is set, the calibration unit 13B images the calibration jig 20 using the first imaging device 11 (step SA 12).
The first image ID1 of the calibration jig 20 is obtained by the image obtaining unit 13A. After obtaining the first image ID1 of the calibration jig 20, the calibration unit 13B calculates the sharpness of the edge of the grid 22 in the first image ID1 (step SA13).
Fig. 13 is a diagram for explaining a method of calculating the sharpness of the edge of the grid 22 according to the embodiment. As shown in fig. 13, the calibration unit 13B sets a part of the first image ID1 as the determination range 23. The calibration unit 13B calculates the sharpness of the edge of the lattice 22 in the determination range 23. The calibration section 13B calculates the sharpness based on, for example, the sum of squares of density differences of adjacent pixels.
The sharpness calculated in step SA13 is output to the calibration storage unit 13C. The calibration storage unit 13C compares the input sharpness with the sharpness already stored, and stores the higher of the two (step SA14).
The processing from step SA11 to step SA14 is repeated. When the maximum sharpness stored in the calibration storage unit 13C is not updated for a predetermined number of focus changes, the process ends.
The focus adjustment processing SA1 of the second image pickup apparatus 12 is the same as the focus adjustment processing SA1 of the first image pickup apparatus 11. Description of the focus adjustment processing SA1 of the second image pickup apparatus 12 is omitted.
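A minimal sketch of the sharpness measure and the repeat-until-no-improvement loop described above is shown below. The squared-difference metric follows the description of step SA13; the camera interface (`set_focus`, `capture`), the candidate focus values, and the stop-after-N-non-improvements counter are hypothetical stand-ins for the actual hardware control and termination condition.

```python
import numpy as np

def edge_sharpness(image: np.ndarray, roi: tuple) -> float:
    """Sum of squared density differences of adjacent pixels inside the
    determination range 23 (step SA13)."""
    patch = image[roi].astype(np.float64)
    dx = np.diff(patch, axis=1)  # differences between horizontal neighbors
    dy = np.diff(patch, axis=0)  # differences between vertical neighbors
    return float((dx ** 2).sum() + (dy ** 2).sum())

def adjust_focus(camera, roi, focus_values, patience=3):
    """Step through candidate focus settings and keep the sharpest one.

    camera.set_focus and camera.capture are assumed interfaces; the loop
    stops once the best sharpness has not been updated `patience` times,
    mirroring the termination condition described above.
    """
    best_sharpness, best_focus, misses = -1.0, None, 0
    for f in focus_values:
        camera.set_focus(f)             # step SA11: set the focus
        image = camera.capture()        # step SA12: image the calibration jig
        s = edge_sharpness(image, roi)  # step SA13: compute sharpness
        if s > best_sharpness:          # step SA14: keep the higher sharpness
            best_sharpness, best_focus, misses = s, f, 0
        else:
            misses += 1
            if misses >= patience:
                break
    return best_focus, best_sharpness
```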
(Pixel rate calculation)
After the focus adjustment processing SA1 ends, the pixel rate calculation processing SA2 is performed for each of the first imaging device 11 and the second imaging device 12. Fig. 14 is a flowchart showing the pixel rate calculation processing SA2 according to the embodiment. The pixel rate calculation processing SA2 of the first imaging device 11 will be described below.
When calculating the pixel rate of the first imaging device 11, the calibration unit 13B calculates the needle reference axis NX as described with reference to fig. 10 (step SA 21).
Next, the calibration unit 13B calculates a simulated grid line 24 passing through the center of the needle reference axis NX (step SA22).
Fig. 15 is a diagram for explaining pixel rate calculation processing SA2 according to the embodiment. As shown in fig. 15, the calibration unit 13B calculates the simulated grid line 24 passing through the center NXo of the needle reference axis NX in the first image ID 1. The simulated grid line 24 is a line passing through the center NXo and parallel to the Xc axis.
Next, the calibration unit 13B calculates an intersection P1 of the simulated grid line 24 and the uppermost grid line 22A, which is arranged on the most +Xc side (+Zm side) in the first image ID1 and is substantially parallel to the XmYm plane. The calibration unit 13B also calculates an intersection P2 of the simulated grid line 24 and the lowermost grid line 22B, which is arranged on the most -Xc side (-Zm side) in the first image ID1 and is substantially parallel to the XmYm plane (step SA23).
Next, the calibration unit 13B calculates the pixel rate based on the pixel distance between the intersection point P1 and the intersection point P2 and the actual distance between the intersection point P1 and the intersection point P2 (step SA 24).
The pixel rate calculation processing SA2 of the second imaging device 12 is the same as the pixel rate calculation processing SA2 of the first imaging device 11, and its description is omitted.
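A minimal sketch of the pixel rate calculation of step SA24, assuming the intersections P1 and P2 have already been located in the image and that the actual distance between them is obtained from the 1 mm grid pitch of the calibration jig 20; the grid-interval count passed in below is a hypothetical input.

```python
import math

def pixel_rate(p1, p2, grid_intervals, grid_pitch_mm=1.0):
    """Millimetres per pixel along the simulated grid line 24 (step SA24).

    p1, p2: pixel coordinates of the intersections with the uppermost and
    lowermost grid lines. grid_intervals: number of 1 mm grid intervals
    between P1 and P2, counted from the image (an assumed input here).
    """
    pixel_distance = math.dist(p1, p2)
    actual_distance_mm = grid_intervals * grid_pitch_mm
    return actual_distance_mm / pixel_distance

# Example: P1 and P2 are 6 grid intervals (6 mm) and 384 pixels apart.
rate = pixel_rate((120.0, 300.0), (504.0, 300.0), grid_intervals=6)
print(f"pixel rate: {rate:.4f} mm/pixel")  # ~0.0156 mm/pixel
```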
(Relative camera angle calculation)
After the pixel rate calculation process SA2 ends, the relative camera angle calculation process SA3, which determines the relative angle between the first imaging device 11 and the second imaging device 12, is performed. The calibration jig 20 described with reference to fig. 7 is used in the relative camera angle calculation process SA3.
Fig. 16 is a diagram for explaining the relative camera angle calculation processing SA3 according to the embodiment. Fig. 17 is a diagram showing an example of first image data of the calibration jig 20 obtained by the first imaging device 11 in the relative camera angle calculation processing SA3 according to the embodiment. Fig. 17 shows the first image ID1 of the calibration jig 20 obtained by the first imaging device 11 in the state shown in fig. 16. Fig. 18 is a diagram for explaining the distance Vsz and the distance Ksz according to the embodiment.
The imaging surface of the image sensor of the first imaging device 11 is parallel to the Xc axis and the Yc axis, respectively. As shown in fig. 16, an angle θ formed by an Xc axis parallel to the imaging plane and the first optical axis AX1 can be calculated based on the distance Ksz and the distance Vsz.
As shown in fig. 16 and 18, the distance Ksz is the Yc-axis direction dimension of the field of view of the first imaging device 11 in the camera coordinate system, that is, the distance between the point Pr representing the +Yc-side end of the field of view and the point Pl representing the -Yc-side end of the field of view. The distance Vsz is the distance between the point Pr and the point Pl shown in fig. 18.
As shown in fig. 18, a virtual vertical field of view Rv is defined at the point Pr, and a virtual vertical field of view Lv is defined at the point Pl. The virtual vertical field of view is the field of view in the image vertical direction (Yc direction) when the calibration jig 20 is assumed to be parallel to the imaging plane.
< needle inspection processing >
The needle inspection processing SB will be described next. The needle inspection processing SB is processing for inspecting the needle 3 based on the first image ID1 of the needle 3 obtained by the first imaging device 11 and the second image ID2 of the needle 3 obtained by the second imaging device 12.
Fig. 19 is a flowchart showing the needle inspection processing SB according to the embodiment. As shown in fig. 19, the needle inspection processing SB includes needle registration processing SB1 and needle inspection processing SB2.
Fig. 20 is a flowchart showing the needle registration processing SB1 according to the embodiment. The needle registration processing SB1 includes a process of imaging the normal reference needle 3R. In the embodiment, the reference needle 3R is a new needle 3. The reference needle 3R is held by the needle bar 4. The first imaging device 11 and the second imaging device 12 each image the reference needle 3R held by the needle bar 4.
The first image ID1 of the reference needle 3R obtained by the first imaging device 11 and the second image ID2 of the reference needle 3R obtained by the second imaging device 12 are obtained by the image obtaining section 13A (step SB 11).
The reference feature amount calculation unit 13D calculates a first reference feature amount based on the first image ID1 of the reference needle 3R held by the needle bar 4, and calculates a second reference feature amount based on the second image ID2 of the reference needle 3R held by the needle bar 4 (step SB 12).
Fig. 21 is a diagram for explaining the reference feature amounts of the sewing needle 3 according to the embodiment. The reference feature amount includes at least one of the first reference feature amount and the second reference feature amount. As shown in fig. 21, the reference feature amount of the sewing needle 3 includes: a length LR of the reference needle 3R, a minimum width AR indicating a minimum value of the width of the reference needle 3R, a maximum width BR indicating a maximum value of the width of the reference needle 3R, and a needle point angle CR indicating an angle of a tip portion of the reference needle 3R.
The reference feature amount storage unit 13E stores the first reference feature amount and the second reference feature amount calculated by the reference feature amount calculation unit 13D at step SB12 (step SB 13).
Fig. 22 is a flowchart showing the needle inspection processing SB2 according to the embodiment. The needle inspection processing SB2 includes a process of imaging the detection needle 3S to be inspected at an arbitrary timing. In the embodiment, the detection needle 3S is the reference needle 3R after use; that is, in the embodiment, the reference needle 3R and the detection needle 3S are the same needle 3. The first imaging device 11 and the second imaging device 12 each image the detection needle 3S held by the needle bar 4 at an arbitrary timing.
The first image ID1 of the detection needle 3S obtained by the first imaging device 11 and the second image ID2 of the detection needle 3S obtained by the second imaging device 12 are obtained by the image obtaining unit 13A (step SB21).
The detection feature amount calculation unit 13F calculates a first detection feature amount based on the first image ID1 of the detection needle 3S held by the needle bar 4, and calculates a second detection feature amount based on the second image ID2 of the detection needle 3S held by the needle bar 4 (step SB22).
Fig. 23 is a diagram for explaining the detection feature amounts of the sewing needle 3 according to the embodiment. The detection feature amount includes at least one of the first detection feature amount and the second detection feature amount. As shown in fig. 23, the detection feature amount of the sewing needle 3 includes: a length LS of the detection needle 3S, a minimum width AS of the detection needle 3S, a maximum width BS of the detection needle 3S, a needle point angle CS indicating an angle of a tip portion of the detection needle 3S, and a deformation amount DS of the detection needle 3S subjected to bending deformation.
The inspection unit 13G compares the first detected feature amount calculated by the detected feature amount calculation unit 13F at step SB22 with the first reference feature amount stored in the reference feature amount storage unit 13E at step SB13, and compares the second detected feature amount calculated by the detected feature amount calculation unit 13F at step SB22 with the second reference feature amount stored in the reference feature amount storage unit 13E at step SB13 (step SB 23).
The inspection unit 13G outputs inspection data indicating the inspection result of the detection needle 3S based on the comparison result at step SB23 (step SB 24).
In the embodiment, the inspection data output from the inspection unit 13G includes the presence or absence of breakage of the detection needle 3S, the presence or absence of blunting of the tip portion of the detection needle 3S, and the presence or absence of bending deformation of the detection needle 3S. When blunting of the tip portion of the detection needle 3S is detected, the inspection data output from the inspection unit 13G includes the blunting amount of the detection needle 3S. When bending deformation of the detection needle 3S is detected, the inspection data output from the inspection unit 13G includes the deformation amount DS of the detection needle 3S.
The inspection unit 13G can determine whether or not the detection needle 3S is broken by comparing the length LR stored in the reference feature amount storage unit 13E with the length LS calculated by the detection feature amount calculation unit 13F. When the difference between the length LS and the length LR is equal to or less than a predetermined length threshold, the inspection unit 13G determines that the detection needle 3S is not broken. When the difference between the length LS and the length LR is greater than the length threshold, the inspection unit 13G determines that the detection needle 3S is broken.
The inspection unit 13G can determine the presence or absence of blunting of the distal end portion of the detection suture needle 3S by comparing the needle tip angle CR stored in the reference characteristic amount storage unit 13E with the needle tip angle CS calculated by the detection characteristic amount calculation unit 13F. When the difference between the needle point angle CS and the needle point angle CR is equal to or smaller than a predetermined angle threshold, the inspection unit 13G determines that the tip portion of the detection needle 3S is not blunted. When the difference between the needle point angle CS and the needle point angle CR is larger than the angle threshold, the inspection unit 13G determines that the tip of the detection needle 3S is blunt.
The inspection unit 13G can determine the presence or absence of bending deformation of the detection needle 3S by comparing the position of the needle reference axis NX stored in the calibration storage unit 13C with the position of the distal end portion of the detection needle 3S calculated by the detection feature amount calculation unit 13F. When the distance (amount of deviation) between the position of the needle reference axis NX and the position of the tip of the detection needle 3S is equal to or less than a predetermined distance threshold, the inspection unit 13G determines that the detection needle 3S is not subjected to bending deformation. When the distance (amount of deviation) is greater than the distance threshold, the inspection unit 13G determines that the detection needle 3S is subjected to bending deformation.
When determining that the bending deformation of the detection needle 3S is detected, the inspection unit 13G can calculate the deformation amount DS based on the distance (amount of displacement) between the needle reference axis NX and the tip end of the detection needle 3S.
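The threshold comparisons performed by the inspection unit 13G can be summarized in the short sketch below. The concrete threshold values, the dictionary form of the inspection data, and the "breakage" label for the length check are illustrative assumptions; the patent only states that predetermined length, angle, and distance thresholds are used.

```python
def inspect_needle(ref, det, nx_tip_offset_mm,
                   length_thresh_mm=0.1, angle_thresh_deg=1.0,
                   distance_thresh_mm=0.05):
    """Compare detection feature amounts with the reference feature amounts.

    ref and det are dicts with keys 'L' (length) and 'C' (needle point angle);
    nx_tip_offset_mm is the distance between the needle reference axis NX and
    the detected tip position. All threshold values here are assumed examples.
    """
    inspection_data = {
        "breakage": abs(det["L"] - ref["L"]) > length_thresh_mm,
        "tip_blunting": abs(det["C"] - ref["C"]) > angle_thresh_deg,
        "bending_deformation": nx_tip_offset_mm > distance_thresh_mm,
    }
    if inspection_data["bending_deformation"]:
        # The deviation from the needle reference axis is reported as DS.
        inspection_data["deformation_amount_mm"] = nx_tip_offset_mm
    return inspection_data
```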
[ Method for calculating feature amounts and inspection data ]
An example of the above-described method for calculating the feature amount (reference feature amount and detection feature amount) will be described below. A method of calculating the detection feature amount by the detection feature amount calculation unit 13F will be described below. The method of calculating the reference feature value by the reference feature value calculating unit 13D is substantially the same as the method of calculating the detection feature value by the detection feature value calculating unit 13F. In the following description, a method of calculating the first detection feature amount from the first image ID1 by the detection feature amount calculation unit 13F will be described. The method of calculating the second detection feature amount from the second image ID2 by the detection feature amount calculation section 13F is substantially the same as the method of calculating the first detection feature amount from the first image ID1 by the detection feature amount calculation section 13F.
Fig. 24 is a diagram showing an example of the first image ID1 of the needle 3 according to the embodiment. The first image ID1 shown in fig. 24 is obtained by the first image pickup device 11. The image obtaining section 13A obtains the first image ID1 from the first imaging device 11.
Fig. 25 and 26 are diagrams each showing an example of a method of processing the first image according to the embodiment. As shown in fig. 25, the detection feature amount calculation unit 13F performs edge extraction processing on the first image ID1. Thereby, the outer shape of the sewing needle 3 is obtained. When the needle 3 is threaded with the upper thread, the edge is extracted discontinuously at the portion where the upper thread is present. When there is a discontinuous edge section EA, the detection feature amount calculation unit 13F may interpolate edge points across the discontinuous edge section EA. By extracting the edge of the needle 3, the detection feature amount calculation unit 13F can calculate the length LS, the minimum width AS, and the maximum width BS as shown in fig. 26.
Further, the detected feature amount calculation unit 13F sets a window in a predetermined region including the tip portion of the needle 3 in the first image ID1, and finely extracts an edge in the window. The detection feature amount calculation unit 13F can calculate the needle point angle CS by finely extracting the edge of the distal end portion of the needle 3.
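A sketch of how the length LS, minimum width AS, and maximum width BS could be measured from an image such as the first image ID1, assuming OpenCV and NumPy. It uses a simple silhouette threshold instead of the edge extraction described above, omits the interpolation across a discontinuous edge section EA and the tip-window measurement of the needle point angle CS, and converts pixels to millimetres with the pixel rate obtained in the calibration process.

```python
import cv2
import numpy as np

def needle_features(image_gray: np.ndarray, pixel_rate_mm: float) -> dict:
    """Length LS, minimum width AS and maximum width BS from a needle image.

    The needle is assumed darker than the background and roughly vertical
    in the image; the threshold choice is illustrative.
    """
    _, mask = cv2.threshold(image_gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    widths, needle_rows = [], []
    for r in range(mask.shape[0]):
        cols = np.flatnonzero(mask[r])
        if cols.size:
            needle_rows.append(r)
            widths.append(cols[-1] - cols[0] + 1)  # per-row needle width [px]
    length_px = needle_rows[-1] - needle_rows[0] + 1
    return {
        "LS_mm": length_px * pixel_rate_mm,
        "AS_mm": min(widths) * pixel_rate_mm,
        "BS_mm": max(widths) * pixel_rate_mm,
    }
```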
[ method for calculating the amount of deformation of a needle subjected to bending deformation ]
Next, a method of calculating the actual deformation amount DS (DSA) of the needle 3 subjected to bending deformation will be described. Fig. 27 is a diagram showing an example of a method of calculating the deformation amount DS of the needle 3 subjected to bending deformation according to the embodiment. In fig. 27, the angle θ is the angle formed by the first optical axis AX1 and the second optical axis AX2. In the embodiment, the angle θ is 90 [°]. The deformation amount DS in the first image ID1 is denoted as the deformation amount DSL, and the deformation amount DS in the second image ID2 is denoted as the deformation amount DSR. The actual deformation amount DS of the distal end portion of the needle 3 is denoted as the deformation amount DSA.
In addition, a sign of the bending direction of the needle 3 is defined in the first image ID1. As shown in fig. 27, a direction approaching the second optical axis AX2 is set as the + direction, and a direction moving away from the second optical axis AX2 is set as the - direction.
Likewise, a sign of the bending direction of the needle 3 is defined in the second image ID2. As shown in fig. 27, a direction approaching the first optical axis AX1 is set as the - direction, and a direction moving away from the first optical axis AX1 is set as the + direction.
The following expressions (1) and (2) hold between the deformation amount DSA, the deformation amount DSL, the deformation amount DSR, and the angle θ. Expression (1) holds when the sign of the bending direction of the needle 3 in the first image ID1 is the same as the sign of the bending direction of the needle 3 in the second image ID2. Expression (2) holds when the signs of the bending directions in the first image ID1 and the second image ID2 are different.
[Equation 1]
(DSA)² = (DSR)² + (DSL)² + 2 × DSR × DSL × cosθ … (1)
[Equation 2]
(DSA)² = (DSR)² + (DSL)² - 2 × DSR × DSL × cosθ … (2)
In this way, in the embodiment, the first imaging device 11 and the second imaging device 12 are arranged such that the first optical axis AX1 and the second optical axis AX2 are orthogonal to each other on the needle 3, and therefore, when the needle is deformed by bending, the deformation amount DSA can be calculated with high accuracy.
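A short sketch of expressions (1) and (2) above. The signed inputs follow the + / - conventions defined for each image, and the sign comparison selects between the two expressions; with θ = 90 [°] both reduce to the square root of the sum of squares.

```python
import math

def actual_deformation(dsl: float, dsr: float, theta_deg: float = 90.0) -> float:
    """Deformation amount DSA from DSL (first image) and DSR (second image).

    dsl and dsr carry the signs defined above; expression (1) is used when
    the signs match, expression (2) when they differ.
    """
    theta = math.radians(theta_deg)
    same_sign = (dsl >= 0) == (dsr >= 0)
    dsl, dsr = abs(dsl), abs(dsr)
    if same_sign:
        dsa_squared = dsr**2 + dsl**2 + 2 * dsr * dsl * math.cos(theta)  # (1)
    else:
        dsa_squared = dsr**2 + dsl**2 - 2 * dsr * dsl * math.cos(theta)  # (2)
    return math.sqrt(dsa_squared)

# With theta = 90 degrees, cos(theta) = 0 and DSA = sqrt(DSL^2 + DSR^2).
print(actual_deformation(0.3, -0.4))  # ≈ 0.5
```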
[ Effect ]
As described above, the first imaging device 11 and the second imaging device 12 are arranged such that the first optical axis AX1 and the second optical axis AX2 are orthogonal to each other on the needle 3, and therefore the needle 3 can be inspected with high accuracy.
The needle 3 is held by the needle bar 4 so as to extend in the Zm axis direction orthogonal to the XmYm plane. The first optical axis AX1 and the second optical axis AX2 are each parallel to the XmYm plane. Therefore, the processing device 13 can accurately determine the presence or absence of bending deformation of the needle 3. In addition, the processing device 13 can calculate the deformation amount DS (DSA) of the needle 3 subjected to bending deformation with high accuracy.
The first detection feature amount and the first reference feature amount are calculated from the first image ID1 obtained by the first imaging device 11. By comparing the first detection feature amount with the first reference feature amount, the state of the sewing needle 3 when viewed from the first imaging device 11 is calculated. The second detection feature amount and the second reference feature amount are calculated from the second image ID2 obtained by the second imaging device 12. By comparing the second detection feature amount with the second reference feature amount, the state of the needle 3 when viewed from the second imaging device 12 is calculated. Since the state of the sewing needle 3 when viewed from both the first imaging device 11 and the second imaging device 12 is calculated, the state of the sewing needle 3 can be determined with high accuracy.

Claims (8)

1. A needle inspection device is provided with:
a first imaging device for imaging a needle held by a needle bar of a sewing machine;
a second imaging device that images the needle; and
a processing device that outputs inspection data of the needle based on a first image obtained by the first imaging device and a second image obtained by the second imaging device,
the first imaging device and the second imaging device are arranged such that a first optical axis representing an optical axis of the first imaging device and a second optical axis representing an optical axis of the second imaging device are orthogonal on the sewing needle.
2. The needle inspection device of claim 1,
the needle is held by the needle bar so as to extend in a direction orthogonal to a predetermined plane, and
the first optical axis and the second optical axis are each parallel to the predetermined plane.
3. The needle inspection device according to claim 1 or 2,
the processing device has:
an image obtaining unit that obtains the first image of the needle from the first imaging device and obtains a second image of the needle from the second imaging device;
a detection feature amount calculation unit that performs image processing on the first image obtained by the image obtaining unit to calculate a first detection feature amount of the needle, and performs image processing on the second image obtained by the image obtaining unit to calculate a second detection feature amount of the needle;
a reference feature value storage unit that stores a first reference feature value and a second reference feature value of the sewing needle; and
an inspection unit that outputs the inspection data by comparing the first detection feature amount with the first reference feature amount and comparing the second detection feature amount with the second reference feature amount.
4. The needle inspection device of claim 3,
the needle includes a reference needle and a detection needle to be inspected,
the processing device has:
a reference feature amount calculation unit that calculates the first reference feature amount based on the first image of the reference needle held by the needle bar and calculates the second reference feature amount based on the second image of the reference needle,
the reference feature storage unit stores the first reference feature and the second reference feature calculated by the reference feature calculation unit.
5. The needle inspection device of claim 4,
the reference needle and the detection needle are the same needle,
the reference needle is a normal needle,
the detection needle is the reference needle after use.
6. The needle inspection device according to any one of claims 1 to 5,
the inspection data includes the presence or absence of an abnormality of the sewing needle.
7. The needle inspection device of claim 6,
the abnormality of the needle includes a bending deformation of the needle.
8. The needle inspection device of claim 7,
the inspection data includes a deformation amount of the needle which is bent and deformed.
CN202011299708.3A 2019-11-25 2020-11-19 Needle inspection device Active CN112833806B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019212194A JP7377079B2 (en) 2019-11-25 2019-11-25 Sewing needle inspection device
JP2019-212194 2019-11-25

Publications (2)

Publication Number Publication Date
CN112833806A true CN112833806A (en) 2021-05-25
CN112833806B CN112833806B (en) 2025-03-04

Family

ID=75923139

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011299708.3A Active CN112833806B (en) 2019-11-25 2020-11-19 Needle inspection device

Country Status (2)

Country Link
JP (1) JP7377079B2 (en)
CN (1) CN112833806B (en)

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1282416A (en) * 1997-10-20 2001-01-31 帕特辛·豪 Optical encoder with at least one lens sheet
CN1453559A (en) * 2003-03-08 2003-11-05 东风汽车公司 Measuring method of bending deformation of crank shaft
EP1677070A1 (en) * 2005-01-03 2006-07-05 Kamax-Werke Rudolf Kellermann GmbH & Co. KG Method and device for determining the deflection of a connecting element
US20060144158A1 (en) * 2005-01-03 2006-07-06 Gunther Hartmann Method and apparatus for determining the deflection of a fastener
JP2009189551A (en) * 2008-02-14 2009-08-27 Brother Ind Ltd Sewing machine
CN101530864A (en) * 2008-03-14 2009-09-16 株式会社英田精密机械 Shape calculating system
JP2011117817A (en) * 2009-12-03 2011-06-16 Si Seiko Co Ltd Article inspection device
US20120104267A1 (en) * 2010-10-29 2012-05-03 Canon Kabushiki Kaisha Imaging apparatus, radiation imaging system, and control method of image sensor
US20120275776A1 (en) * 2011-04-28 2012-11-01 Centre De Recherche Industrielle Du Quebec Camera enclosure assembly
WO2015139505A1 (en) * 2014-03-20 2015-09-24 Harbin Institute Of Technology Method and equipment based on detecting the polarization property of a polarization maintaining fiber probe for measuring structures of a micro part
CN104483331A (en) * 2014-12-03 2015-04-01 东莞市神州视觉科技有限公司 A three-dimensional detection method, device and system for connector pins
DE102015121582A1 (en) * 2014-12-12 2016-06-16 Werth Messtechnik Gmbh Method and device for measuring features on workpieces
CN104990498A (en) * 2015-06-16 2015-10-21 广东电网有限责任公司电力科学研究院 Power plant boiler high-temperature pipe system macro displacement measurement device and method based on CCD photography
CN104913739A (en) * 2015-06-26 2015-09-16 北方工业大学 Visual measurement method and device for eccentricity of crank throw of crankshaft
CN109564167A (en) * 2016-08-18 2019-04-02 株式会社斯库林集团 Check device and inspection method
CN107284455A (en) * 2017-05-16 2017-10-24 浙江理工大学 A kind of ADAS systems based on image procossing
CN109295616A (en) * 2017-07-25 2019-02-01 Juki株式会社 sewing machine
CN107830813A (en) * 2017-09-15 2018-03-23 浙江理工大学 The longaxones parts image mosaic and flexural deformation detection method of laser wire tag
CN111801545A (en) * 2017-12-20 2020-10-20 株式会社新川 Line shape inspection device and line shape inspection method
CN208567810U (en) * 2018-07-04 2019-03-01 中国船舶重工集团公司第七一九研究所 A kind of abrasion of automation petroleum drilling and mining equipment and device for detecting deformation

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114717832A (en) * 2022-04-18 2022-07-08 江苏天鸟高新技术股份有限公司 Rapid detection method for broken needles for spinning

Also Published As

Publication number Publication date
JP2021083452A (en) 2021-06-03
JP7377079B2 (en) 2023-11-09
CN112833806B (en) 2025-03-04

Similar Documents

Publication Publication Date Title
JP4811272B2 (en) Image processing apparatus and image processing method for performing three-dimensional measurement
US11972589B2 (en) Image processing device, work robot, substrate inspection device, and specimen inspection device
CN114274260B (en) Cutting machine and machine-readable carrier
JP6412730B2 (en) Edge position detection device, width measurement device, and calibration method thereof
JP5438475B2 (en) Gap step measurement device, gap step measurement method, and program thereof
JP5875272B2 (en) 3D measurement method
JP4623657B2 (en) Captured image processing apparatus and captured image processing method for electronic component mounter
JP4275149B2 (en) Boundary position determination apparatus, method for determining boundary position, program for causing computer to function as the apparatus, and recording medium
CN112833806B (en) Needle inspection device
CN116615302A (en) Method for detecting suspension position of support bar and flatbed machine tool
JP5545737B2 (en) Component mounter and image processing method
US20160116990A1 (en) Depth determining method and depth determining device of operating body
EP2975921A1 (en) Component recognition system for component mounting machine
KR101653861B1 (en) Drawing data generating method, drawing method, drawing data generating apparatus and drawing apparatus
JP2009192483A (en) 3D shape measuring method and 3D shape measuring apparatus
JP2008203214A (en) Work deformation/distortion detecting method
JP2014202661A (en) Range finder
WO2020217970A1 (en) Wire shape measurement device, wire three-dimensional image generation method, and wire shape measurement method
JP5572247B2 (en) Image distortion correction method
JP5730114B2 (en) Component rotation angle detection device, image processing component data creation device, component rotation angle detection method, and image processing component data creation method
CN116634134A (en) Imaging system calibration method and device, storage medium and electronic equipment
JP2007205868A (en) Optical shape inspection apparatus and method
JP4757701B2 (en) Electronic component suction position correction method and apparatus
EP4491998A1 (en) Measuring device and forming machine
JP2004094442A (en) Method and device for measuring number of sheet

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant