
CN117173225B - High-precision registration method for complex PCB - Google Patents

High-precision registration method for complex PCB

Info

Publication number
CN117173225B
CN117173225B (granted publication of application CN202311239069.5A; earlier publication CN117173225A)
Authority
CN
China
Prior art keywords
image
array
sub
matched
template
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311239069.5A
Other languages
Chinese (zh)
Other versions
CN117173225A (en)
Inventor
王冬云
严志博
吴瀚洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Normal University CJNU
Original Assignee
Zhejiang Normal University CJNU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Normal University CJNU filed Critical Zhejiang Normal University CJNU
Priority to CN202311239069.5A priority Critical patent/CN117173225B/en
Publication of CN117173225A publication Critical patent/CN117173225A/en
Application granted granted Critical
Publication of CN117173225B publication Critical patent/CN117173225B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 — Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 — Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the field of PCB processing and image recognition, and in particular to a high-precision registration method for complex PCBs. The method comprises the following steps: a, generating a template image from the processing file of the PCB and collecting an image to be matched; b, identifying the array data of the array units in the template image, the array data comprising the array counts and the coordinates of each array unit; c, segmenting the image to be matched into sub-images according to the array data obtained in step b, obtaining as many sub-images as there are array units; d, performing feature-point matching between the array unit and every sub-image to obtain the coordinate information of the registration sub-image of the array unit on each sub-image; and e, projectively transforming the registration sub-images onto the template image to establish the correspondence between the template image and the matched image.

Description

High-precision registration method for complex PCB
Technical Field
The invention relates to the field of PCB processing and image recognition, in particular to a high-precision registration method for a complex PCB.
Background
In recent years computer vision has developed rapidly, and many image registration methods are based on it. Among them, the NCC (Normalized Cross-Correlation), SIFT (Scale-Invariant Feature Transform) and SURF (Speeded-Up Robust Features) methods are well applied to bare-PCB registration and defect detection. These methods nevertheless have problems: first, their real-time performance degrades as the number of surface features on production PCBs grows; second, they match on feature similarity, so on PCBs produced with a panelization (jigsaw) process identical features are easily confused with one another, causing projection-matrix calculation errors and registration failure.
In the prior art there also exist patent documents CN 106373161 B and CN 114897946 A, in which feature points are used to extract camera-image features and template features, matching point pairs between the camera image and the template image are computed, interior points are obtained through an outlier-rejection algorithm, and finally a homography (projection) matrix is computed from the interior points, so that a registered picture is obtained by projective transformation.
Both techniques are suitable for PCBs under simpler conditions (few bonding pads, small area, independent single boards), where they achieve high-precision matching with good real-time performance. In a real industrial production environment, however (uncertain pad count, large area, panelized boards), and especially when panels are processed, the whole PCB carries several array units whose elements are identical, so the probability of mismatches is high and real-time performance is poor.
Disclosure of Invention
To overcome these defects and shortcomings of the prior art, the invention provides a high-precision registration method for complex PCBs, which can rapidly identify the identical array units in a panelized PCB, confirm their number and positions, and improve the image-recognition precision for panelized PCBs.
To realize the aim of rapid image recognition of panelized PCBs, the invention provides a high-precision registration method for complex PCBs, comprising the following steps:
Step a, generating a template image from the processing file of the PCB, and collecting an image to be matched;
Step b, identifying the array data of the array units in the template image, the array data comprising the array counts and the coordinates of each array unit;
Step c, segmenting the image to be matched into sub-images according to the array data obtained in step b, obtaining as many sub-images as there are array units;
Step d, performing feature-point matching between the array unit and all the sub-images to obtain the coordinate information of the registration sub-image of the array unit on each sub-image;
Step e, projectively transforming the registration sub-graph onto the template image, and establishing the correspondence between the template image and the matched image.
Preferably, after step d and before step e, a whole-image similarity check is performed between the registration sub-graph and the array unit:
gray-threshold binarization is applied to the registration sub-graph and to the array unit respectively, a bitwise XOR is then taken between the two processed images, and the XOR output is summed to obtain the similarity between the registration sub-graph and the array unit.
Preferably, in step a, acquiring the image to be matched comprises the steps of:
First, the PCB is photographed with a camera to obtain an original image; a foreground-extraction algorithm then determines the PCB region in the original image and extracts it to obtain the image to be matched.
The foreground extraction algorithm comprises the following steps:
S1, converting the acquired original image into a grayscale image;
S2, binarizing the grayscale image to obtain a binary image whose pixels are 0 or 1;
S3, denoising the binary image: an opening operation removes background noise and a closing operation removes foreground noise, giving the noise-reduced image;
S4, extracting the contours and taking the largest contour from them;
S5, computing the perimeter of the largest contour, using it as the approximation-precision parameter to fit a contour-approximating polygon, and applying the Douglas-Peucker algorithm to obtain the four vertices of the fitted quadrilateral;
S6, taking the two right-angle vertices of the four that lie farthest from the image edge as centres, drawing intersecting circles with radius two thirds of the right-angle side length, and extracting the inner intersection point;
S7, extracting the corresponding right-angle edges of the PCB template image and drawing intersecting circles around the right-angle vertices to extract the inner intersection points;
S8, computing the affine matrix from the right-angle vertices and the inner circle-intersection points on the original image and on the template image;
S9, applying the affine transformation to the noise-reduced image to obtain the image to be matched from the original image.
Preferably, in step b, the array data of the array units in the template image are identified with an array image detection algorithm, which comprises the following steps:
T1, cutting several slices from the template image in the X direction and in the Y direction respectively;
T2, using each slice as a template and performing template matching over the whole image;
T3, performing non-maximum suppression (NMS) on the matching results and counting the matches in the X and Y directions;
T4, counting the matches of each slice and computing the mode;
T5, using the mode as the array count of its direction and outputting the array matrix;
T6, in each direction, taking the match lists of the slices whose match count equals the mode as samples, and computing the distance differences between adjacent elements in each slice's match list in that direction;
T7, taking the mode of all distance differences in each direction to obtain the accurate difference for that direction;
T8, combining the differences in the X and Y directions to obtain the length and width of the array unit;
T9, sliding the array template image with index (0, 0) in the X direction while sliding the image with index (n, m), comparing the two images, and recording the similarity in a list;
T10, sliding the array template image with index (0, 0) in the Y direction, comparing it with the image with index (n, m), and recording the similarity in a list;
T11, taking the maximum of the X-direction list to obtain the X-direction starting point;
T12, taking the maximum of the Y-direction list to obtain the Y-direction starting point;
T13, combining the starting points, array counts and array-unit size of the X and Y directions to obtain the output array information.
Preferably, the feature matching in step d is performed independently for each sub-image and accelerated in parallel with multiple threads.
Preferably, in step d, SIFT, SURF or ORB is adopted as the feature-point extraction algorithm;
the detected feature points in the two sub-graphs are matched with the approximate nearest-neighbour search library FLANN; the matching mainly comprises importing the feature descriptors, building the tree-structure index, searching the tree space, computing the nearest-neighbour matching scores, and generating the final matching result.
Preferably, in step d the RANSAC algorithm is used to optimize the matching point set, rejecting "outliers" and keeping "interior points". Its main steps are:
selecting a minimum data set from which the model can be estimated;
fitting the model to all data and counting the "interior points"; comparing the interior-point count of the current model with the best model found so far, recording the model parameters and interior-point count of the model with the most interior points; repeating these steps until the iterations end.
Preferably, step e comprises computing the projected vertex matrix from the homography matrix and the matrix of the four corner vertices of the original-image-region sub-graph, and then projectively transforming each sub-graph with its own projection matrix.
Preferably, step e is followed by step f,
Step f, stitching all projectively transformed sub-graphs into a high-precision matched large image.
The high-precision registration method for complex PCBs can cope with panelized PCBs containing large numbers of repeated elements, and offers fast recognition, high precision and fully automatic operation.
Specifically, a high-precision template image is generated directly from the processing file of the PCB, and the repeated array units of the panel, their horizontal and vertical array counts, and the coordinates of each array unit are then identified on the template image.
The image to be matched, photographed directly from the PCB, is cut according to the array-identification result; each cut sub-image is then independently feature-matched against the array unit to determine the position corresponding to the array unit on the sub-image, i.e. the registration sub-graph, together with its coordinate information; projective transformation is then performed according to the positions of the registration image and the template image. The PCB processing file thus establishes a correspondence with the template file, and the template file in turn establishes a correspondence with the image to be matched.
Additional features and advantages of the invention will be set forth in the detailed description which follows.
Drawings
FIG. 1 is a flow chart of one embodiment of the present invention.
Detailed Description
The following describes specific embodiments of the present invention in detail. It should be understood that the detailed description and specific examples, while indicating and illustrating the invention, are not intended to limit the invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe the embodiments of the invention herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
To solve the problems of conventional PCB processing referred to in the Background section, the invention provides a high-precision registration method for a complex PCB, comprising the following steps.
Step a, generating a template image from the processing file of the PCB, and collecting an image to be matched;
Step b, identifying the array data of the array units in the template image, the array data comprising the array counts and the coordinates of each array unit;
Step c, segmenting the image to be matched into sub-images according to the array data obtained in step b, obtaining as many sub-images as there are array units;
Step d, performing feature-point matching between the array unit and all the sub-images to obtain the coordinate information of the registration sub-image of the array unit on each sub-image;
Step e, projectively transforming the registration sub-graph onto the template image, and establishing the correspondence between the template image and the matched image.
In step a, generating the template image comprises reading the processing file of the PCB and automatically converting it into a simulated template image; specifically, the GERBER file is read, drawn programmatically on a high-resolution virtual canvas, and output in a picture format.
In step a, acquiring the image to be matched comprises the steps of firstly shooting a PCB board by using a camera to acquire an original image. The original image includes a PCB portion, a processing platform, a fixture, and the like, so that a foreground extraction algorithm is required to determine a PCB area in the original image and extract the PCB area to obtain an image to be matched.
The foreground extraction algorithm comprises the following steps:
S1, the original image shot directly by the industrial camera is named PCB-IMG; it is a colour image containing colour information. The acquired original image is converted to a grayscale image by a colour-space conversion of PCB-IMG from the RGB colour space to the gray colour space, with the formula Gray = 0.299·R + 0.587·G + 0.114·B, yielding PCB-IMG-GRAY.
S2, the grayscale image is binarized with an image threshold: pixels of PCB-IMG-GRAY above the threshold are set to 255 and pixels below it to 0, yielding PCB-IMG-GRAY-BINARY. This converts the gray map into a binary map, i.e. an image containing only two pixel values.
S3, the binary image is denoised to obtain the noise-reduced image: an opening operation removes background noise and a closing operation removes foreground noise, i.e. the noise (white specks on a black background, or black specks on a white background) left in the unprocessed image is removed. The method is as follows:
noise is removed by morphological operations, first an opening to remove background noise and then a closing to remove foreground noise, yielding PCB-IMG-GRAY-BINARY-FLITED. The opening is erosion followed by dilation; the closing is dilation followed by erosion. Dilation enlarges the image to a certain extent: A⊕B denotes the dilation of the set A by the structuring element B, with the formula
A⊕B = { z | (B̂)z ∩ A ≠ ∅ }.
Erosion shrinks the image to a certain extent: AΘB denotes the erosion of the set A by the set B, with the formula
AΘB = { z | (B)z ⊆ A }.
S4, the contours of PCB-IMG-GRAY-BINARY-FLITED are extracted and the largest contour MAX_CNT is taken from them; this largest contour is the outline of the PCB and consists of a closed point set enclosing the target.
S5, the perimeter of MAX_CNT is computed and used as the approximation-precision parameter to fit a contour-approximating polygon; the Douglas-Peucker algorithm then yields the four vertices of the fitted quadrilateral. The purpose is to compute the vertices of the PCB contour image from the point set for the subsequent transformation.
S6, the two right-angle vertices of the four that lie farthest from the image edge are taken as centres, intersecting circles with radius two thirds of the right-angle side length are drawn, and the inner intersection point is extracted.
A note on steps S6-S7: the affine transformation requires a lattice of at least three points and its corresponding mapped lattice; the two edge points alone are insufficient.
The purpose of the circles is to extract that third point: whatever transformation the figure undergoes, the relative relation of this point to the edge is unchanged.
S7, the corresponding right-angle edges of the PCB template image are extracted and intersecting circles are drawn around the right-angle vertices to extract the inner intersection points; the operation and purpose are the same as in step S6.
S8, the affine matrix is computed from the right-angle vertices and the inner circle-intersection points on the original image and on the template image. With the three point pairs (xᵢ, yᵢ) → (xᵢ′, yᵢ′), the 2×3 affine matrix M satisfies
[xᵢ′, yᵢ′]ᵀ = M · [xᵢ, yᵢ, 1]ᵀ, i = 1, 2, 3.
S9, the affine transformation is applied to the noise-reduced image; that is, PCB-IMG-GRAY-BINARY-FLITED is affine-transformed to obtain a roughly registered image to be matched from the original image.
Array image detection algorithm
In step b, a template image is first obtained from the processing file of the PCB, and the array data of the array units in the template image are then identified with an array image detection algorithm, which comprises the following steps:
T1, several slices are cut from the template image in the X direction and in the Y direction respectively, i.e. extracted from the X and Y sides of the template image. The number of slices is adjustable, preferably 3-9 per direction, and each slice is 20-100 pixels wide. A slice extends through the entire template image in its long direction, and the several slices of one direction are preferably adjacent.
Before T1 the edges of the template image must be removed, i.e. the part of the actual PCB carrying no components or wires; since it contains no image features, it is unsuitable as matching material. A practical method is not to start slicing at the very edge but to leave a margin of, e.g., 10-50 pixels from the edge before taking the first slice.
The X direction and Y direction here are two mutually perpendicular directions, or may be understood as the direction of the image's length and the direction of its width.
T2, taking each slice as a template, and performing template matching on the whole graph;
Each slice is used in turn as a template and template matching is performed over the whole image; the matching output is a matrix of shape n×4×2, where n is the number of matched images, 4 stands for the four box vertices of a matched image, and 2 for the X and Y coordinate values. Standard normalized correlation matching is adopted, with the formula
R(x, y) = Σ T′(x′, y′)·I′(x + x′, y + y′) / sqrt( Σ T′(x′, y′)² · Σ I′(x + x′, y + y′)² ),
the sums running over the template coordinates (x′, y′).
Specifically, OpenCV provides six methods for this matching: cv.TM_CCOEFF, cv.TM_CCOEFF_NORMED, cv.TM_CCORR, cv.TM_CCORR_NORMED, cv.TM_SQDIFF and cv.TM_SQDIFF_NORMED. Two pictures are input (the template and the large picture to be matched), and the n objects that meet the set matching threshold are output.
This step finds out in how many places the slice template occurs in the full image, i.e. how many regions of the image are identical to the slice.
T3, non-maximum suppression (NMS) is performed on the matching results and the matches in the X and Y directions are counted. Because the n regions matched over the whole image in T2 may overlap, NMS keeps only the match with the highest score in each region, giving the accurate count of the slice template over the whole image.
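The suppression in T3 can be sketched as a plain NMS routine — a generic sketch assuming axis-aligned (x1, y1, x2, y2) boxes with one score each, not the patent's exact procedure:

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.3):
    """Non-maximum suppression: keep the best-scoring match in each
    cluster of overlapping boxes; boxes are (x1, y1, x2, y2)."""
    boxes = np.asarray(boxes, dtype=float)
    order = np.argsort(scores)[::-1]        # best score first
    keep = []
    while order.size:
        i = order[0]
        keep.append(int(i))
        # intersection of the top box with all remaining boxes
        xx1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        yy1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        xx2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        yy2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + areas - inter)
        order = order[1:][iou <= iou_thresh]  # drop overlapping boxes
    return keep
```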
T4, the matches of each slice are counted and the mode is computed. Steps T2-T3 operate on a single slice, while T1 produces several slices, so T2-T3 are executed once per slice, forming a sample set the size of the slice count. Not every sample records the array count completely and correctly, so mode statistics over the sample set eliminate this uncertainty and yield the more accurate true array count of that direction over the whole image.
T5, the mode is adopted as the array count in that direction and the array matrix is output. The array matrix of this step carries only the count information, without coordinates: the PCB has edges for mechanical clamping during production, which were removed and cut away in step T1, so the array starting point is not necessarily at position (0, 0).
T6, in each direction (X and Y), the match lists of the slices whose match count equals the mode are taken as samples, and the distance difference between adjacent elements in each slice's match list in that direction is computed; this difference between adjacent matches is the width of the array unit in that direction.
T7, the mode of all distance differences in each direction is taken to obtain the accurate difference for that direction; since several slices are used, the width computed from a single slice may deviate, so the mode is taken to eliminate the error.
T8, the differences in the X and Y directions are combined to obtain the length and width of the array unit.
T9, the array template image with index (0, 0) is slid in the X direction while the image with index (n, m) is slid simultaneously; the two images are compared and the similarity is recorded in a list;
T10, the same is done in the Y direction: the (0, 0) array template image and the (n, m) image are slid, compared, and the similarity is recorded in a list;
T11, the maximum of the X-direction list is taken to obtain the X-direction starting point;
T12, the maximum of the Y-direction list is taken to obtain the Y-direction starting point;
T13, the data of T8-T12 are combined, i.e. the starting points, array counts and array-unit size of the X and Y directions, to obtain the output array information. The information obtained comprises an image of an array unit, the array counts in the X and Y directions, and the position of every array unit.
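The mode statistics of T4-T7 are the core of the robustness argument above. A minimal sketch (function names are my own; only the mode-voting idea comes from the text):

```python
import numpy as np

def array_count_by_mode(per_slice_counts):
    """Steps T4-T5: each slice votes on how many array units it saw
    along its direction; the mode of the votes is taken as the true
    array count, discarding corrupted slice counts."""
    values, freq = np.unique(per_slice_counts, return_counts=True)
    return int(values[np.argmax(freq)])

def unit_size_by_mode(match_positions):
    """Steps T6-T7: the spacing between adjacent matches of one slice
    is the array-unit width in that direction; the mode over all
    slices removes outlier spacings."""
    diffs = []
    for pos in match_positions:          # one match list per slice
        diffs.extend(np.diff(sorted(pos)))
    values, freq = np.unique(diffs, return_counts=True)
    return int(values[np.argmax(freq)])
```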
After step d and before step e, a whole-image similarity check is performed between the registration sub-graph and the array unit: gray-threshold binarization is applied to each, a bitwise XOR is taken between the two processed images, and the XOR output is summed to obtain their similarity.
Specifically, a registration sub-image a and an array-unit image b are taken; gray-threshold binarization gives a-t and b-t; the bitwise XOR of a-t and b-t gives the image ab-xor, in which white pixels mark where a-t and b-t differ, so the post-matching difference between the two images is obtained by computing the white-pixel area.
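The XOR check can be sketched in a few lines of NumPy — an illustration of the binarize/XOR/sum idea, with an assumed threshold of 127 and a similarity expressed as the agreeing-pixel fraction:

```python
import numpy as np

def xor_similarity(sub_img, unit_img, thresh=127):
    """Whole-image similarity check between a registration sub-image
    and its array unit: binarize both, XOR them, count differing
    pixels. Returns the fraction of agreeing pixels (1.0 = identical)."""
    a = (np.asarray(sub_img) > thresh).astype(np.uint8)
    b = (np.asarray(unit_img) > thresh).astype(np.uint8)
    diff = np.bitwise_xor(a, b)          # 1 = disagreement (white pixel)
    return 1.0 - diff.sum() / diff.size
```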
In step d the feature matching processes each sub-image independently and is accelerated in parallel with multiple threads. The scale space used by the feature extraction is built as
L(x, y, σ) = G(x, y, σ) * I(x, y),
where L is the Gaussian-blurred image, G is the Gaussian kernel function, I is the original picture, and the DoG image D is
D(x, y, σ) = L(x, y, kσ) − L(x, y, σ).
The feature descriptor is built from the gray-gradient directions of the surrounding pixels:
1) the obtained angle values are divided into 36 equal bins;
2) the gradient values are computed in the scale space corresponding to the feature point;
3) the weights are computed with a Gaussian kernel over the gradients.
That is, the weight of each surrounding pixel is determined by two values: the magnitude of its own gradient, and its distance from the pixel under investigation.
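The per-sub-image independence noted above makes the matching embarrassingly parallel. A minimal thread-pool sketch — the single-pair matcher `match_one` (e.g. SIFT+FLANN) is passed in rather than fixed, and both names are my own:

```python
from concurrent.futures import ThreadPoolExecutor

def match_all_subimages(unit_img, sub_images, match_one):
    """Step d: the array unit is matched against every sub-image
    independently, so each pair can run on its own worker thread."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(match_one, unit_img, s) for s in sub_images]
        return [f.result() for f in futures]  # results in input order
```

OpenCV's matching calls release the GIL during native computation, which is why a thread pool (rather than processes) is a reasonable choice here.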
In step d, SIFT, SURF or ORB is adopted as the feature-point extraction algorithm;
the detected feature points in the two sub-graphs are matched with the approximate nearest-neighbour search library FLANN; the matching mainly comprises importing the feature descriptors, building the tree-structure index, searching the tree space, computing the nearest-neighbour matching scores, and generating the final matching result.
Then, in step d, the RANSAC algorithm is used to optimize the matching point set, rejecting "outliers" and keeping "interior points". Its main steps are:
selecting a minimum data set from which the model can be estimated;
fitting the model to all data and counting the "interior points"; comparing the interior-point count of the current model with the best model found so far, recording the model parameters and interior-point count of the model with the most interior points; repeating these steps until the iterations end.
Then the homography matrix from the original-image-region sub-graph to the template sub-graph coordinates is computed from the interior-point set; the algorithm finds and returns the transformation matrix H between the source plane and the target plane.
The back-projection error is computed as the distance ‖dstᵢ − H·srcᵢ‖ for each matched point pair (srcᵢ, dstᵢ); pairs whose error exceeds the reprojection threshold are treated as outliers.
The projected vertex matrix is then computed from the homography matrix and the matrix of the four corner vertices of the original-image-region sub-graph, and each sub-graph is projectively transformed with its own projection matrix.
Step f, all projectively transformed sub-graphs are stitched into a high-precision matched large image. This final registered large image is not mandatory and is generated only when needed.
The preferred embodiments of the present invention have been described in detail above, but the present invention is not limited to the specific details of the above embodiments, and various simple modifications can be made to the technical solution of the present invention within the scope of the technical concept of the present invention, and all the simple modifications belong to the protection scope of the present invention.
In addition, the specific features described in the above embodiments may be combined in any suitable manner without contradiction. The various possible combinations of the invention are not described in detail in order to avoid unnecessary repetition.
Moreover, any combination of the various embodiments of the invention can be made without departing from the spirit of the invention, which should also be considered as disclosed herein.

Claims (9)

1. A high-precision registration method for complex PCBs, characterized by comprising the following steps:
Step a: generating a template image from the processing file of the PCB, and acquiring an image to be matched;
Step b: identifying array data of the array units in the template image, the array data comprising the array count and the coordinates of each array unit;
Step c: segmenting the image to be matched into sub-images according to the array data obtained in step b, obtaining as many sub-images as there are array units;
Step d: performing feature point matching between the array unit and all sub-images to obtain the coordinate information, on the sub-images, of the sub-images registered with the array unit;
Step e: projectively transforming the registered sub-images onto the template image, thereby establishing the correspondence between the template image and the matched image;
In step b, the array data of the array units in the template image are identified by an array image detection algorithm, which comprises the following steps:
T1: cutting several slices from the template image along the X direction and the Y direction respectively;
T2: using each slice as a template and performing template matching over the whole image;
T3: applying non-maximum suppression (NMS) to the matching results and counting the numbers of matches in the X and Y directions;
T4: counting the number of matches for each slice and computing the mode;
T5: using the mode as the array count in its direction and outputting the array matrix;
T6: in each direction, taking as samples the match lists of the slices whose match counts equal the mode, and computing the differences between the positions of adjacent elements in each such slice's match list in that direction;
T7: computing the mode of all distance differences in each direction to obtain the accurate spacing for that direction;
T8: combining the spacings in the X and Y directions to obtain the length and width of the array unit;
T9: sliding the array template image with index (0, 0) in the X direction while simultaneously sliding the image with index (n, m), comparing the two images and recording the similarity into a list;
T10: sliding the array template image with index (0, 0) in the Y direction while simultaneously sliding the image with index (n, m), comparing the two images and recording the similarity into a list;
T11: finding the maximum value in the X-direction list to obtain the starting point in the X direction;
T12: finding the maximum value in the Y-direction list to obtain the starting point in the Y direction;
T13: combining the starting points, array counts and array unit size in the X and Y directions to obtain the output array information.
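The counting logic of steps T4-T7 can be sketched as follows. The function name and the list-of-match-positions data layout are illustrative assumptions, not taken from the patent:

```python
from statistics import mode

def array_count_and_pitch(match_positions_per_slice):
    """Given, for each slice, the sorted positions of its template matches
    along one axis (after NMS), estimate the array count and the unit pitch:
    the mode of the match counts gives the array count (T4-T5), and the mode
    of adjacent-position differences gives the accurate spacing (T6-T7)."""
    counts = [len(p) for p in match_positions_per_slice]
    n = mode(counts)                        # T4-T5: array count = mode of counts
    diffs = []
    for p in match_positions_per_slice:
        if len(p) == n:                     # T6: only slices matching the mode
            diffs += [b - a for a, b in zip(p, p[1:])]
    pitch = mode(diffs)                     # T7: spacing = mode of differences
    return n, pitch
```

Using the mode rather than the mean makes both estimates robust against slices with spurious or missing matches.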
2. The high-precision registration method for complex PCBs according to claim 1, characterized in that after step d is completed and before step e, a full-image similarity check is performed between the registered sub-image and the array unit:
gray-threshold binarization is applied to the registered sub-image and the array unit respectively, a bitwise XOR operation is then performed between the two processed images, and the XOR output is summed to obtain the similarity between the registered sub-image and the array unit.
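The binarize-XOR-summarize check of this claim can be sketched as below; the threshold value of 128 and the normalization of the score to [0, 1] are illustrative choices, not values specified by the patent:

```python
import numpy as np

def xor_similarity(img_a, img_b, thresh=128):
    """Binarize two grayscale images with a gray threshold, XOR them,
    and summarize the XOR output as a similarity score in [0, 1]
    (1.0 means the binary images are identical)."""
    a = (np.asarray(img_a) >= thresh).astype(np.uint8)
    b = (np.asarray(img_b) >= thresh).astype(np.uint8)
    diff = np.bitwise_xor(a, b)             # 1 wherever the binary images disagree
    return 1.0 - diff.sum() / diff.size     # fraction of agreeing pixels
```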
3. The high-precision registration method for complex PCBs according to claim 1, characterized in that in step a, acquiring the image to be matched comprises the following steps:
first, photographing the PCB with a camera to obtain an original image; then, determining the PCB region in the original image with a foreground extraction algorithm and extracting it to obtain the image to be matched.
4. The high-precision registration method for complex PCBs according to claim 3, characterized in that the foreground extraction algorithm comprises the following steps:
S1: converting the acquired original image into a grayscale image;
S2: binarizing the grayscale image to obtain a binary image in which each pixel is 0 or 1;
S3: denoising the binary image to obtain a noise-reduced image, wherein an opening operation removes background noise and a closing operation removes foreground noise;
S4: extracting contours and selecting the largest contour among them;
S5: computing the perimeter of the largest contour, using it as the approximation-accuracy parameter for the contour's approximating polygon, and obtaining the four vertices of the fitted quadrilateral with the Douglas-Peucker algorithm;
S6: taking, as centers, the two right-angle vertices among the four vertices that are farther from the image edge, drawing intersecting circles with a radius of two thirds of the right-angle side length, and extracting the inner intersection points;
S7: extracting the corresponding right-angle edges of the PCB template image, drawing intersecting circles centered at the right-angle vertices, and extracting the inner intersection points;
S8: computing an affine matrix from the right-angle vertices and the inner intersection points of the intersecting circles on the original image and the template image;
S9: applying the affine transformation to the noise-reduced image to obtain the image to be matched from the original image.
5. The high-precision registration method for complex PCBs according to claim 1, characterized in that when the feature matching of step d is performed, each sub-image is processed independently and accelerated in parallel using multithreading.
6. The high-precision registration method for complex PCBs according to claim 1, characterized in that in step d the feature point extraction algorithm is SIFT, SURF or ORB;
the feature points detected in the two sub-images are matched using the Fast Library for Approximate Nearest Neighbors (FLANN), which mainly comprises importing the feature descriptors, building a tree-structured index, searching the tree space, computing the nearest-neighbor descriptor matching scores, and generating the final matching result.
7. The high-precision registration method for complex PCBs according to claim 6, characterized in that in step d the matched point set is optimized with the RANSAC algorithm, removing "outliers" and retaining "interior points" (inliers), which mainly comprises the following steps:
selecting a minimum data set from which the model can be estimated;
substituting all data points into the model and counting the number of "interior points"; comparing the inlier count of the current model with that of the best model found so far; recording the model parameters and inlier count of the model with the most inliers; and repeating these steps until the iteration ends.
8. The high-precision registration method for complex PCBs according to claim 1, characterized in that step e comprises:
calculating the projected vertex matrix from the homography matrix and the vertex matrix of the four corners of the original-image region sub-image, and then projectively transforming each sub-image using its own projection matrix.
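The corner-projection step of this claim reduces to a homogeneous matrix multiply followed by perspective division. A minimal sketch (function name and argument layout are illustrative):

```python
import numpy as np

def project_corners(H, w, h):
    """Apply homography H to the four corner vertices of a w x h sub-image:
    multiply in homogeneous coordinates, then divide by the last row."""
    corners = np.array([[0, 0, 1], [w, 0, 1], [w, h, 1], [0, h, 1]], float).T
    p = H @ corners                  # 3x4 projected homogeneous vertices
    return (p[:2] / p[2]).T          # back to 4x2 Cartesian coordinates
```

The resulting quadrilateral gives the placement of each projectively transformed sub-image on the template.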
9. The high-precision registration method for complex PCBs according to claim 1, characterized by further comprising, after step e, a step f:
Step f: stitching all projectively transformed sub-images to form a high-precision matched large image.
CN202311239069.5A 2023-09-22 2023-09-22 High-precision registration method for complex PCB Active CN117173225B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311239069.5A CN117173225B (en) 2023-09-22 2023-09-22 High-precision registration method for complex PCB


Publications (2)

Publication Number Publication Date
CN117173225A CN117173225A (en) 2023-12-05
CN117173225B true CN117173225B (en) 2025-07-22

Family

ID=88945009

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311239069.5A Active CN117173225B (en) 2023-09-22 2023-09-22 High-precision registration method for complex PCB

Country Status (1)

Country Link
CN (1) CN117173225B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117094983B (en) * 2023-09-22 2025-09-16 浙江师范大学 Machine vision image array detection method
CN118134976B (en) * 2024-01-18 2025-04-01 北京航天飞行控制中心 A method, system, device and medium for matching planetary surface images at different viewing angles
CN119374532B (en) * 2024-10-24 2025-10-31 河南秦尉数字技术有限公司 PCB jointed board inspection method and computing equipment
CN119919409B (en) * 2025-04-02 2025-05-30 西安捷航电子科技有限公司 PCB welding defect rapid identification method based on YOLO

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103824289A (en) * 2014-02-17 2014-05-28 哈尔滨工业大学 Template-based array image registration method in snapshot spectral imaging
CN107194959A (en) * 2017-04-25 2017-09-22 北京海致网聚信息技术有限公司 The method and apparatus that image registration is carried out based on section

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105261022B (en) * 2015-10-19 2018-12-18 广州视源电子科技股份有限公司 PCB board matching method and device based on outer contour
CN108537833B (en) * 2018-04-18 2022-06-21 昆明物理研究所 Infrared image rapid splicing method
CN113298701B (en) * 2021-06-22 2022-04-26 北京航空航天大学 An integrated imaging 3D image registration method for multi-screen stitching
CN116433666B (en) * 2023-06-14 2023-08-15 江西萤火虫微电子科技有限公司 Method, system, electronic device and storage medium for online defect identification of board circuit




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant