CN110826571B - Image traversal algorithm for rapid image identification and feature matching - Google Patents
- Publication number
- CN110826571B CN110826571B CN201911059631.XA CN201911059631A CN110826571B CN 110826571 B CN110826571 B CN 110826571B CN 201911059631 A CN201911059631 A CN 201911059631A CN 110826571 B CN110826571 B CN 110826571B
- Authority
- CN
- China
- Prior art keywords
- image
- rgb
- traversing
- coordinate
- correspond
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/28—Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Analysis (AREA)
Abstract
The application discloses an image traversal algorithm for rapid image identification and feature matching, which comprises the following steps: the computer obtains the coordinates of each pixel on the image and the corresponding RGB color value, converts the color image into a binary grayscale image, starts identification from the upper-left corner of the image, and finds the pixel corresponding to a given pixel of the feature pattern as an identification point. The identification point is selected as the origin of traversal coordinates, and a regional, layered design is applied to the image: with the traversal coordinate origin as the center, the image is divided into several regions and traversed clockwise starting from one region, where the traversal coordinate points of each layer of the image are the sets of same-level coordinate points of every region, and the remaining pixels of the feature pattern are searched for. Because the application starts searching for each new identification point from coordinates close to the previous identification point, the efficiency of image identification and feature matching is greatly improved.
Description
Technical Field
The application relates to the technical field of image recognition, and in particular to an image traversal algorithm for rapid image recognition and feature matching.
Background
The traditional image traversal algorithm resembles matrix traversal: taking the first pixel in the upper-left corner as the origin of coordinates, every pixel is visited in turn until the pixel in the lower-right corner is reached. This brute-force approach is feasible and does solve the problem, but in experiments each traversal starts again from the upper-left corner and the feature points can only be found by visiting every pixel one by one, so the efficiency is very low.
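The brute-force traversal just described can be sketched as follows (a minimal illustration with hypothetical names; the patent does not give this code):

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Naive raster-scan traversal: visit every pixel from the top-left
// origin to the bottom-right corner, returning the first pixel that
// matches a target value. In the worst case every pixel is visited once.
std::pair<int, int> rasterScanFind(int width, int height,
                                   const std::vector<uint8_t>& pixels,
                                   uint8_t target) {
    for (int j = 0; j < height; ++j)        // rows, top to bottom
        for (int i = 0; i < width; ++i)     // columns, left to right
            if (pixels[j * width + i] == target)
                return {i, j};              // first match in raster order
    return {-1, -1};                        // not found
}
```

Because the scan always restarts at (0, 0), a feature point near the bottom-right corner costs nearly width × height comparisons, which is the inefficiency the application addresses.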
Disclosure of Invention
The application provides an image traversal algorithm for rapid image identification and feature matching, which is used for improving the efficiency of image identification and feature matching.
In order to solve the technical problems, the embodiment of the application discloses the following technical scheme:
an embodiment of the application discloses an image traversal algorithm for rapid image identification and feature matching, which comprises the following steps: the computer obtains the coordinates of each pixel on the image and the corresponding RGB color value;
converting the color image into a binary grayscale image;
identifying from the upper-left corner of the image, and finding the pixel corresponding to a given pixel of the feature pattern as an identification point;
selecting the identification point as the origin of traversal coordinates, and applying a regional, layered design to the image;
dividing the image into several regions with the traversal coordinate origin as the center, traversing the image clockwise starting from one region, where the traversal coordinate points of each layer of the image are the sets of same-level coordinate points of every region, and searching for the remaining pixels of the feature pattern.
Optionally, dividing the image into several regions with the traversal coordinate origin as the center and traversing clockwise from one region includes: dividing the image into four regions with the traversal coordinate origin as the center, the four regions comprising an upper-right region, a lower-right region, a lower-left region and an upper-left region, and traversing in the clockwise direction starting from the upper-right region.
Optionally, the image traversing algorithm further includes: the image is acquired using a camera and transmitted to a computer.
Optionally, the image traversal algorithm further includes: mapping the camera to the computer screen in proportion, using a resolution of 640 x 480.
Optionally, the image is an 8 by 6 pixel size image.
Optionally, the conversion of the color image into a binary grayscale image uses the formula: (color->red()*77 + color->green()*151 + color->blue()*28) >> 8.
Compared with the prior art, the application has the beneficial effects that:
the application provides an image traversal algorithm for quick image identification and feature matching, which comprises the following steps: the computer obtains the coordinates of each pixel point on the image and the RGB value of the corresponding color, converts the gray level of the color image into a binary gray level image, starts to identify from the upper left corner of the image, and finds out the pixel point corresponding to a certain pixel point of the characteristic graph as an identification point. Selecting the identification points as traversal coordinate origins, and carrying out regional hierarchical design on the image; dividing the image into a plurality of areas by taking the traversing coordinate origin as the center, traversing the image from a certain area in a clockwise direction, wherein the traversing coordinate points of each layer of the image are the coordinate point sets of the same level of each area, and searching other pixel points of the feature graph. The application designs and realizes an image traversing algorithm for quick image recognition and feature matching based on image recognition, bayesian decision theory, C++ programming language, linear discriminant function, matrix traversal, gray calculation and other methods, and searches new recognition points from coordinates close to the last time, thereby greatly improving the efficiency of image recognition and feature matching, and having the remarkable advantages of quick image traversing speed, high image recognition efficiency, accurate feature matching, high reliability and the like.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
In order to more clearly illustrate the technical solution of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic diagram of coordinate arrangement of each pixel point of an image according to an embodiment of the present application;
fig. 2 is a schematic diagram illustrating an upper right area algorithm provided in an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a lower right area algorithm provided in an embodiment of the present application;
fig. 4 is a schematic diagram illustrating a lower left area algorithm provided in an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating an upper left region algorithm provided in an embodiment of the present application;
FIG. 6 is a color arrangement diagram of each pixel point of an image according to an embodiment of the present application;
FIG. 7 is a test chart of an image traversal algorithm according to an embodiment of the present application;
fig. 8 is a feature pattern for testing according to an embodiment of the present application.
Detailed Description
In order to make the technical solution of the present application better understood by those skilled in the art, the technical solution of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, shall fall within the scope of the present application.
An image is acquired with a camera and transmitted to a computer; the computer obtains the coordinates of each pixel on the image and the corresponding RGB color value; identification starts from the upper-left corner of the image, and the pixel corresponding to a given pixel of the feature pattern is found as an identification point; the identification point is selected as the origin of traversal coordinates, and a regional, layered design is applied to the image; the image is divided into several regions with the traversal coordinate origin as the center and traversed clockwise starting from one region, where the traversal coordinate points of each layer of the image are the sets of same-level coordinate points of every region, and the remaining pixels of the feature pattern are searched for.
Further, the image is divided into four regions with the traversal coordinate origin as the center, the four regions comprising an upper-right region, a lower-right region, a lower-left region and an upper-left region, and traversal proceeds in the clockwise direction starting from the upper-right region.
Further, the camera is mapped to the computer screen in proportion, using a resolution of 640 x 480.
Further, the method also includes the following step: converting the color image into a binary grayscale image, using the formula
(color->red()*77 + color->green()*151 + color->blue()*28) >> 8,
which converts the color picture into a binary grayscale image for image recognition and image analysis.
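As a sketch of this step, the formula can be written out directly, with plain r, g, b channel values standing in for the Qt-style color->red() accessors (the binarization threshold of 128 is an assumption; the text does not state which cut-off is used):

```cpp
#include <cstdint>

// Integer grayscale approximation from the formula above:
// gray = (r*77 + g*151 + b*28) >> 8. The weights sum to 256 and
// approximate the luminance coefficients 0.30 / 0.59 / 0.11.
uint8_t toGray(uint8_t r, uint8_t g, uint8_t b) {
    return static_cast<uint8_t>((r * 77 + g * 151 + b * 28) >> 8);
}

// Binarize against a threshold; the cut-off of 128 is an assumption,
// as the patent does not state which threshold it uses.
uint8_t binarize(uint8_t gray, uint8_t threshold = 128) {
    return gray >= threshold ? 255 : 0;
}
```

For example, pure red RGB(255,0,0) maps to (255*77) >> 8 = 76, which this threshold binarizes to 0 (black).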
The image traversal algorithm provided by the application was tested as follows: the pixel coordinates of the image are arranged as shown in fig. 1.
An image is acquired with a camera and transmitted to a computer; the computer obtains the coordinates of each pixel on the image and the corresponding RGB color value. Camera processing thread and coordinate mapping: the camera and the computer screen are mapped in proportion in the coordinate mapping, using a resolution of 640 x 480. During feature-pattern recognition, the camera collects the picture information and transmits it to the computer. The camera thread automatically adapts to the 640 x 480 resolution when the camera is opened, which facilitates the coordinate mapping of the image, and the captured image is processed into an image at the corresponding multiple of that resolution.
The coordinates of each pixel on the image and the corresponding RGB color values are then obtained. The image is based on a picture 8 by 6 pixels in size; the coordinate arrangement of each pixel on the picture and the corresponding color, i.e. the RGB value (composed of the three primary colors red, green and blue), are taken from the "Edit colors" dialog of the drawing tool that ships with Windows. This choice was made because the values there are RGB values of the same nature, and the drawing tool makes it convenient to check whether the output values are correct when comparing against the later algorithm test.
In order to facilitate debugging and testing in the algorithm writing process, the image is converted into an image with the size of 8 times 6 pixels, wherein the pixels are closely arranged.
Assume the coordinate position of a selected pixel in the middle of the image is (i, j). The first-layer pixel of the upper-right region then has coordinate (i, j-1); the second layer has coordinates (i, j-2), (i+1, j-2) and (i+1, j-1); and so on for the third layer, the fourth layer, and beyond. The lower-right, lower-left and upper-left regions are handled similarly. With this method of partitioning and layering, all blocks can be traversed in an orderly and efficient manner, and the coordinates and color values of the searched points are displayed.
Traversal takes the coordinates of a selected pixel as the origin; at most four regions need to be traversed, and the clockwise traversal order is upper-right region, lower-right region, lower-left region, upper-left region.
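The partitioning and layering described above can be sketched as a function that assigns each pixel a layer index relative to the traversal origin. This is a reconstruction from the worked example later in the description (the identifier names and the exact half-open quadrant boundaries are our reading, not code from the patent):

```cpp
#include <algorithm>

// Layer index of pixel (i, j) relative to the traversal origin (i0, j0).
// Each quadrant is a half-open wedge so the four regions tile the image
// without overlap; layer n of a quadrant is its n-th square ring out
// from the origin.
int layerOf(int i0, int j0, int i, int j) {
    int di = i - i0, dj = j - j0;
    if (di >= 0 && dj < 0)  return std::max(di + 1, -dj);   // upper right
    if (di > 0 && dj >= 0)  return std::max(di, dj + 1);    // lower right
    if (di <= 0 && dj > 0)  return std::max(-di + 1, dj);   // lower left
    if (di < 0 && dj <= 0)  return std::max(-di, -dj + 1);  // upper left
    return 0;  // the origin itself
}
```

With origin (3, 3) this reproduces the layer assignments in the traversal results of the test section, e.g. (3, 2) lies in the first layer of the upper-right region and (7, 0) in its fifth layer.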
Assume the image is image, so that image.width()-1 is the maximum abscissa of the image and image.height()-1 is the maximum ordinate; number is the number of layers, and maxRect is the side length of the largest square of the current region.
(1) The upper right region algorithm derivation procedure is shown as the thick line box region in fig. 2:
Transverse row: the ordinate is greater than the picture width minus the abscissa (i.e. j > image.width()-i); the maximum square side length equals the picture width minus the abscissa (i.e. maxRect = image.width()-i); the number of layers equals the ordinate (i.e. number = j);
Vertical row: the ordinate is less than the picture width minus the abscissa (i.e. j < image.width()-i); the maximum square side length equals the ordinate (i.e. maxRect = j); the number of layers equals the picture width minus the abscissa (i.e. number = image.width()-i).
(2) The lower right region algorithm derivation procedure is shown as the thick line box region in fig. 3:
Transverse row: the picture width minus the abscissa minus 1 is less than or equal to the picture height minus the ordinate (i.e. image.width()-i-1 <= image.height()-j); the maximum square side length equals the picture width minus the abscissa minus 1 (i.e. maxRect = image.width()-i-1); the number of layers equals the picture height minus the ordinate (i.e. number = image.height()-j);
Vertical row: the picture width minus the abscissa minus 1 is greater than the picture height minus the ordinate (i.e. image.width()-i-1 > image.height()-j); the maximum square side length equals the picture height minus the ordinate (i.e. maxRect = image.height()-j); the number of layers equals the picture width minus the abscissa minus 1 (i.e. number = image.width()-i-1).
(3) The lower left region algorithm derivation procedure is shown as the thick line box region in fig. 4:
Transverse row: the abscissa is less than the picture height minus the ordinate minus 1 (i.e. i < image.height()-j-1); the maximum square side length equals the abscissa plus 1 (i.e. maxRect = i+1); the number of layers equals the picture height minus the ordinate minus 1 (i.e. number = image.height()-j-1);
Vertical row: the abscissa is greater than or equal to the picture height minus the ordinate minus 1 (i.e. i >= image.height()-j-1); the maximum square side length equals the picture height minus the ordinate minus 1 (i.e. maxRect = image.height()-j-1); the number of layers equals the abscissa plus 1 (i.e. number = i+1).
(4) The upper left region algorithm derivation procedure is shown as the thick line box region in fig. 5:
Transverse row: the picture width minus the abscissa minus 1 is greater than or equal to the picture height minus the ordinate (i.e. image.width()-i-1 >= image.height()-j); the maximum square side length equals the abscissa (i.e. maxRect = i); the number of layers equals the ordinate plus 1 (i.e. number = j+1);
Vertical row: the picture width minus the abscissa minus 1 is less than the picture height minus the ordinate (i.e. image.width()-i-1 < image.height()-j); the maximum square side length equals the ordinate plus 1 (i.e. maxRect = j+1); the number of layers equals the abscissa (i.e. number = i).
Namely, the image traversal algorithm provided by the application is as follows:
Upper right region: transverse row: j > image.width()-i, maxRect = image.width()-i, number = j; vertical row: j < image.width()-i, maxRect = j, number = image.width()-i.
Lower right region: transverse row: image.width()-i-1 <= image.height()-j, maxRect = image.width()-i-1, number = image.height()-j; vertical row: image.width()-i-1 > image.height()-j, maxRect = image.height()-j, number = image.width()-i-1.
Lower left region: transverse row: i < image.height()-j-1, maxRect = i+1, number = image.height()-j-1; vertical row: i >= image.height()-j-1, maxRect = image.height()-j-1, number = i+1.
Upper left region: transverse row: image.width()-i-1 >= image.height()-j, maxRect = i, number = j+1; vertical row: image.width()-i-1 < image.height()-j, maxRect = j+1, number = i.
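The four summary formulas above can be transcribed as follows, with width/height passed as plain parameters in place of image.width()/image.height(). In each region the two branches always assign the smaller of the two quantities to maxRect and the larger to number, so they are written here as min/max; note that for the upper-left region the text's branch condition compares image.width()-i-1 with image.height()-j rather than i with j+1, so the min/max form there is our reading, chosen because it reproduces the worked example in the test section:

```cpp
#include <algorithm>

// {maxRect, number} for each region around the traversal origin (i, j):
// maxRect is the side length of the largest square in the region and
// number is its layer count, per the summary formulas in the text.
struct Info { int maxRect; int number; };

Info upperRight(int w, int /*h*/, int i, int j) {
    return {std::min(j, w - i), std::max(j, w - i)};
}
Info lowerRight(int w, int h, int i, int j) {
    return {std::min(w - i - 1, h - j), std::max(w - i - 1, h - j)};
}
Info lowerLeft(int /*w*/, int h, int i, int j) {
    return {std::min(i + 1, h - j - 1), std::max(i + 1, h - j - 1)};
}
Info upperLeft(int /*w*/, int /*h*/, int i, int j) {
    return {std::min(i, j + 1), std::max(i, j + 1)};
}
```

With the 8 by 6 test image and origin (3, 3), these give 5, 4, 4 and 4 layers for the upper-right, lower-right, lower-left and upper-left regions respectively, matching the traversal results listed in the test section.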
As in fig. 6: the pixel color values from left to right and from top to bottom are as follows:
First row: RGB(255,128,128), RGB(255,255,128), RGB(128,255,128), RGB(0,255,128), RGB(128,255,255), RGB(0,128,255), RGB(255,128,192), RGB(255,128,255).
Second row: RGB(255,0,0), RGB(255,255,0), RGB(128,255,0), RGB(0,255,64), RGB(0,255,255), RGB(0,128,192), RGB(128,128,192), RGB(255,0,128).
Third row: RGB(128,64,64), RGB(255,128,64), RGB(0,255,0), RGB(0,128,128), RGB(0,64,128), RGB(128,128,255), RGB(128,0,64), RGB(255,0,128).
Fourth row: RGB(128,0,0), RGB(255,128,0), RGB(0,128,0), RGB(0,0,255), RGB(0,0,255), RGB(0,0,160), RGB(128,0,128), RGB(128,0,255).
Fifth row: RGB(64,0,0), RGB(128,64,0), RGB(0,64,0), RGB(0,64,64), RGB(0,0,128), RGB(0,0,64), RGB(64,0,64), RGB(64,0,128).
Sixth row: RGB(0,0,0), RGB(128,128,0), RGB(128,128,64), RGB(128,128,128), RGB(64,128,128), RGB(192,192,192), RGB(64,0,64), RGB(255,255,255).
As shown in fig. 7, (3, 3) is selected as the origin of the traversal coordinates and the algorithm is then used to traverse; the traversal results are as follows:
Starting from the selected coordinate point (3, 3) as the traversal origin, the whole image is divided into four regions and traversed clockwise from the upper-right region around to the upper-left region; the traversed coordinate points of each layer of the whole image are the sets of same-level coordinate points of each region, specifically as follows:
(1) upper right region:
a first layer: the coordinate points (3, 2) correspond to the color values RGB (0,128,128);
a second layer: the coordinate points (3, 1) correspond to the color values RGB (0,255,64); the coordinate points (4, 1) correspond to the color values RGB (0,255,255); the coordinate points (4, 2) correspond to the color values RGB (0,64,128);
third layer: the coordinate points (3, 0) correspond to the color values RGB (0,255,128); the coordinate points (4, 0) correspond to the color values RGB (128,255,255); coordinate points (5, 0) correspond to color values RGB (0,128,255);
the coordinate points (5, 1) correspond to the color values RGB (0,128,192); the coordinate points (5, 2) correspond to the color values RGB (128,128,255);
fourth layer: the coordinate points (6, 0) correspond to the color values RGB (255,128,192); the coordinate points (6, 1) correspond to the color values RGB (128,128,192); the coordinate points (6, 2) correspond to the color values RGB (128,0,64);
fifth layer: the coordinate points (7, 0) correspond to the color values RGB (255,128,255); the coordinate points (7, 1) correspond to the color values RGB (255,0,255); the coordinate points (7, 2) correspond to the color values RGB (255,0,128);
(2) lower right area:
a first layer: the coordinate points (4, 3) correspond to the color values RGB (0,0,255);
a second layer: coordinate points (5, 3) correspond to color values RGB (0,0,160); coordinate points (5, 4) correspond to color values RGB (0,0,64); the coordinate points (4, 4) correspond to the color values RGB (0,0,128);
third layer: the coordinate points (6, 4) correspond to the color values RGB (64,0,64); coordinate points (6, 5) correspond to color values RGB (64,0,64); coordinate points (5, 5) correspond to color values RGB (192,192,192); coordinate points (4, 5) correspond to color values RGB (64,128,128);
fourth layer: the coordinate points (7, 3) correspond to the color values RGB (128,0,255); the coordinate points (7, 4) correspond to the color values RGB (64,0,128); the coordinate points (7, 5) correspond to the color values RGB (255,255,255);
(3) lower left area:
a first layer: the coordinate points (3, 4) correspond to the color values RGB (0,64,64);
a second layer: the coordinate points (3, 5) correspond to the color values RGB (128,128,128); the coordinate points (2, 5) correspond to the color values RGB (128,128,64); the coordinate points (2, 4) correspond to the color values RGB (0,64,0);
third layer: the coordinate points (1, 5) correspond to the color values RGB (128,128,0); the coordinate points (1, 4) correspond to the color values RGB (128,64,0);
fourth layer: the coordinate points (0, 5) correspond to the color values RGB (0,0,0); the coordinate points (0, 4) correspond to the color values RGB (64,0,0);
(4) upper left region:
a first layer: the coordinate points (2, 3) correspond to the color values RGB (0,128,0);
a second layer: the coordinate points (1, 3) correspond to the color values RGB (255,128,0); the coordinate points (1, 2) correspond to the color values RGB (255,128,64); the coordinate points (2, 2) correspond to the color values RGB (0,255,0);
third layer: coordinate points (0, 3) correspond to color values RGB (128,0,0); coordinate points (0, 2) correspond to color values RGB (128,64,64); coordinate point (0, 1) corresponds to color value RGB (255,0,0) and coordinate point (1, 1) corresponds to color value RGB (255,255,0); the coordinate points (2, 1) correspond to the color values RGB (128,255,0);
fourth layer: the coordinate points (0, 0) correspond to the color values RGB (255,128,128); the coordinate points (1, 0) correspond to the color values RGB (255,255,128); the coordinate point (2, 0) corresponds to the color value RGB (128,255,128).
According to the traversing method provided by the application, the whole hierarchical traversal is that corresponding hierarchies of each region are orderly assembled together in the clockwise direction, and the result is as follows:
a first layer: upper right first layer + lower right first layer + lower left first layer + upper left first layer
The coordinate points (3, 2) correspond to the color values RGB (0,128,128); the coordinate points (4, 3) correspond to the color values RGB (0,0,255); the coordinate points (3, 4) correspond to the color values RGB (0,64,64); the coordinate points (2, 3) correspond to the color values RGB (0,128,0);
a second layer: upper right second layer + lower right second layer + lower left second layer + upper left second layer
The coordinate points (3, 1) correspond to the color values RGB (0,255,64); the coordinate points (4, 1) correspond to the color values RGB (0,255,255); the coordinate points (4, 2) correspond to the color values RGB (0,64,128); coordinate points (5, 3) correspond to color values RGB (0,0,160); coordinate points (5, 4) correspond to color values RGB (0,0,64); the coordinate points (4, 4) correspond to the color values RGB (0,0,128); the coordinate points (3, 5) correspond to the color values RGB (128,128,128); the coordinate points (2, 5) correspond to the color values RGB (128,128,64); the coordinate points (2, 4) correspond to the color values RGB (0,64,0); the coordinate points (1, 3) correspond to the color values RGB (255,128,0); the coordinate points (1, 2) correspond to the color values RGB (255,128,64); the coordinate points (2, 2) correspond to the color values RGB (0,255,0);
third layer: upper right third layer + lower right third layer + lower left third layer + upper left third layer
The coordinate points (3, 0) correspond to the color values RGB (0,255,128); the coordinate points (4, 0) correspond to the color values RGB (128,255,255); coordinate points (5, 0) correspond to color values RGB (0,128,255); the coordinate points (5, 1) correspond to the color values RGB (0,128,192); the coordinate points (5, 2) correspond to the color values RGB (128,128,255); the coordinate points (6, 3) correspond to the color values RGB (128,0,128); the coordinate points (6, 4) correspond to the color values RGB (64,0,64); coordinate points (6, 5) correspond to color values RGB (64,0,64); coordinate points (5, 5) correspond to color values RGB (192,192,192); coordinate points (4, 5) correspond to color values RGB (64,128,128); the coordinate points (1, 5) correspond to the color values RGB (128,128,0); the coordinate points (1, 4) correspond to the color values RGB (128,64,0); coordinate points (0, 3) correspond to color values RGB (128,0,0); coordinate points (0, 2) correspond to color values RGB (128,64,64); the coordinate points (0, 1) correspond to the color values RGB (255,0,0); the coordinate points (1, 1) correspond to the color values RGB (255,255,0); the coordinate points (2, 1) correspond to the color values RGB (128,255,0);
fourth layer: upper right fourth layer + lower right fourth layer + lower left fourth layer + upper left fourth layer
The coordinate points (6, 0) correspond to the color values RGB (255,128,192); the coordinate points (6, 1) correspond to the color values RGB (128,128,192); the coordinate points (6, 2) correspond to the color values RGB (128,0,64); the coordinate points (7, 3) correspond to the color values RGB (128,0,255); the coordinate points (7, 4) correspond to the color values RGB (64,0,128); the coordinate points (7, 5) correspond to the color values RGB (255,255,255); the coordinate points (0, 5) correspond to the color values RGB (0,0,0); the coordinate points (0, 4) correspond to the color values RGB (64,0,0); the coordinate points (0, 0) correspond to the color values RGB (255,128,128); the coordinate points (1, 0) correspond to the color values RGB (255,255,128); the coordinate point (2, 0) corresponds to the color value RGB (128,255,128).
Fifth layer: upper right fifth layer
The coordinate points (7, 0) correspond to the color values RGB (255,128,255); the coordinate points (7, 1) correspond to the color values RGB (255,0,255); the coordinate points (7, 2) correspond to the color values RGB (255,0,128).
The image traversal algorithm provided by the application was further tested as follows:
The feature pattern design is shown in fig. 8: three concentric circles alternating between black and white, where the circle with the smallest radius is white, the middle circle is black, and the circle with the largest radius is white. The traversal rule for the concentric circles is consistent with that of the original image. One reason for choosing concentric circles as the feature pattern shape, and for designing their colors as white, black and white, is that black and white are the result of image binarization, so the binarization step is omitted outright and the identification step is simplified; the other reason is that, with the traversal algorithm designed above, once the colors match there is no need to examine the shape of the identification pattern, and a circle is easier to traverse and identify than other shapes.
The formula used to convert the color image into a binary grayscale image is: (color->red()*77 + color->green()*151 + color->blue()*28) >> 8.
Traversal identification and matching of the feature pattern: after the camera finds an identification point for the first time, each region on the screen is traversed layer by layer with that identification point as the origin, and the next cycle is performed until all identification points are found. The camera captures the identification point starting from the upper-left corner: the first search for an identification point succeeds when the concentric circle with the smallest radius (white) is found; each region on the screen is then traversed layer by layer with that point as the origin, continuously checking whether a concentric circle with a larger radius (black) exists near the white point until the second concentric circle is found; the largest concentric circle (white) is sought by the same method. If all three concentric circles are successfully matched, the radius of the smallest concentric circle is returned and the identification point has been found successfully. During identification, the following two operations are carried out to improve the efficiency of image identification and matching.
(1) Perform interval estimation on the data of the selected image and estimate the maximum-probability hit area in order to save resources, with the plan of using maximum likelihood estimation and point estimation.
(2) Perform statistical pattern recognition on each point in the image (samples with similar properties lie close to each other in the pattern space and form a group, i.e. a cluster). The analysis method is: according to the feature vector Xi = (xi1, xi2, …, xid)^T (i = 1, 2, …, N) measured for a given pattern, assign the pattern to one of the classes ω1, ω2, …, ωc, and then judge the classification according to a distance function between the patterns, where T denotes transposition, N is the number of sample points, and d is the number of features of a sample.
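The distance-based classification just outlined can be sketched as a minimum-distance classifier (the squared Euclidean distance and the one-prototype-per-class setup are illustrative assumptions; the text only calls for "a distance function between the patterns"):

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// Assign a measured feature vector X = (x1, ..., xd)^T to the nearest
// class prototype under squared Euclidean distance, returning the index
// of the chosen class among the classes omega_1, ..., omega_c.
std::size_t nearestClass(const std::vector<double>& x,
                         const std::vector<std::vector<double>>& prototypes) {
    std::size_t best = 0;
    double bestDist = std::numeric_limits<double>::infinity();
    for (std::size_t c = 0; c < prototypes.size(); ++c) {
        double d2 = 0.0;
        for (std::size_t k = 0; k < x.size(); ++k) {
            double diff = x[k] - prototypes[c][k];
            d2 += diff * diff;  // accumulate squared distance in class c
        }
        if (d2 < bestDist) { bestDist = d2; best = c; }
    }
    return best;
}
```

Squared distance is used because comparing squared distances picks the same winner as comparing true distances while avoiding the square root.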
In summary, the present application provides an image traversal algorithm for rapid image recognition and feature matching, including: acquiring an image with a camera and transmitting it to a computer; obtaining, on the computer, the coordinates of each pixel point of the image and the RGB value of its color; converting the color image into a binary grayscale image; identifying from the upper left corner of the image and finding a pixel point corresponding to a certain pixel point of the feature graph as an identification point; selecting the identification point as the traversal coordinate origin and performing a regional, hierarchical design of the image; and dividing the image into a plurality of areas centered on the traversal coordinate origin, traversing the image from a certain area in the clockwise direction, where the traversal coordinate points of each layer of the image are the sets of same-level coordinate points of each area, and searching for the other pixel points of the feature graph. Based on image recognition, Bayesian decision theory, the C++ programming language, linear discriminant functions, matrix traversal, grayscale calculation, and other methods, the application designs and implements an image traversal algorithm for rapid image recognition and feature matching that searches for new identification points starting from coordinates close to the previous one, greatly improving the efficiency of image recognition and feature matching, with the notable advantages of fast image traversal, high recognition efficiency, accurate feature matching, and high reliability.
The embodiments in this specification are described with reference to one another, so identical and similar parts among the different embodiments may be referred to each other and are not described again here.
It should be noted that, in this specification, terms such as "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a circuit structure, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such circuit structure, article, or apparatus.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure of the application herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
The embodiments of the present application described above do not limit the scope of the present application.
Claims (6)
1. An image traversal algorithm for rapid image recognition and feature matching, comprising:
the computer obtains the coordinates of each pixel point on the image and the RGB value of the corresponding color;
identifying from the upper left corner of the image, and finding a pixel point corresponding to a certain pixel point of the feature graph as an identification point;
selecting the identification points as traversal coordinate origins, and carrying out regional hierarchical design on the image;
dividing the image into a plurality of areas centered on the traversal coordinate origin, and traversing the image from a certain area in a clockwise direction, wherein the traversal coordinate points of each layer of the image are the sets of same-level coordinate points of each area, so as to search for the other pixel points of the feature graph.
2. The image traversal algorithm according to claim 1, wherein said dividing the image into a plurality of areas centered on the traversal coordinate origin and traversing clockwise from a certain area comprises: dividing the image into four areas centered on the traversal coordinate origin, the four areas comprising an upper right area, a lower right area, a lower left area, and an upper left area, and traversing in a clockwise direction starting from the upper right area.
3. The image traversal algorithm according to claim 1, further comprising: the image is acquired using a camera and transmitted to a computer.
4. The image traversal algorithm according to claim 3, further comprising: scaling the camera to the computer screen using a resolution of 640 x 480.
5. The image traversal algorithm according to claim 1, further comprising: the image is an 8 by 6 pixel size image.
6. The image traversal algorithm according to claim 1, further comprising: the grayscale formula of the color image is as follows:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911059631.XA CN110826571B (en) | 2019-11-01 | 2019-11-01 | Image traversal algorithm for rapid image identification and feature matching |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110826571A CN110826571A (en) | 2020-02-21 |
CN110826571B true CN110826571B (en) | 2023-10-20 |
Family
ID=69551935
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911059631.XA Active CN110826571B (en) | 2019-11-01 | 2019-11-01 | Image traversal algorithm for rapid image identification and feature matching |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110826571B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111582290B (en) * | 2020-05-13 | 2023-04-07 | 郑州轻工业大学 | Computer image recognition method |
CN112085030B (en) * | 2020-09-09 | 2024-08-09 | 重庆广播电视大学重庆工商职业学院 | Similar image determining method and device |
CN113533158B (en) * | 2021-07-06 | 2022-07-19 | 中国地质大学(北京) | Coal reservoir pore structure parameter quantitative analysis method based on SEM image |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101499171A (en) * | 2009-02-13 | 2009-08-05 | 上海海事大学 | Video processing oriented fast target partition and identification method |
CN102867181A (en) * | 2012-07-27 | 2013-01-09 | 华南理工大学 | Characteristic extraction module for digital image processing and traversing method |
CN103345743A (en) * | 2013-06-18 | 2013-10-09 | 宁波成电泰克电子信息技术发展有限公司 | Image segmentation method for intelligent flaw detection of cell tail end |
CN104484659A (en) * | 2014-12-30 | 2015-04-01 | 南京巨鲨显示科技有限公司 | Method for automatically identifying and calibrating medical color images and medical gray scale images |
CN106875405A (en) * | 2017-01-19 | 2017-06-20 | 浙江大学 | CT image pulmonary parenchyma template tracheae removing methods based on BFS |
CN108564056A (en) * | 2018-04-25 | 2018-09-21 | 中国水利水电科学研究院 | A kind of method of remote sensing image identifying water boy extraction |
CN109118528A (en) * | 2018-07-24 | 2019-01-01 | 西安工程大学 | Singular value decomposition image matching algorithm based on area dividing |
CN109741394A (en) * | 2018-12-10 | 2019-05-10 | 北京拓尔思信息技术股份有限公司 | Image processing method, device, electronic equipment and storage medium |
CN109918974A (en) * | 2017-12-13 | 2019-06-21 | 南京机器人研究院有限公司 | A kind of robot target recognition methods |
CN110348263A (en) * | 2019-06-24 | 2019-10-18 | 西安理工大学 | A kind of two-dimensional random code image recognition and extracting method based on image recognition |
Non-Patent Citations (2)
Title |
---|
Liu Zunren. Graph Traversal. In: Data Structures. 2018. *
Traversal-type nine-grid image search method and its application; Dong Yingying; Data Theory and Application; 107-113 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111179251B (en) | Defect detection system and method using template comparison based on Siamese neural network | |
CN112348787B (en) | Training method of object defect detection model, object defect detection method and device | |
CN108562589B (en) | Method for detecting surface defects of magnetic circuit material | |
CN108520514B (en) | Consistency detection method of printed circuit board electronic components based on computer vision | |
CN110826571B (en) | Image traversal algorithm for rapid image identification and feature matching | |
US20020102018A1 (en) | System and method for color characterization using fuzzy pixel classification with application in color matching and color match location | |
CN105913093A (en) | Template matching method for character recognizing and processing | |
CN116630301A (en) | Strip steel surface small target defect detection method and system based on super resolution and YOLOv8 | |
WO2017101225A1 (en) | Trademark graph element identification method, apparatus and system, and computer storage medium | |
JP2004295879A (en) | Defect classification method | |
WO2021228194A1 (en) | Cable detection method, robot and storage device | |
CN110462634A (en) | Mark Detection video analysis method | |
CN110569774A (en) | An Automatic Digitization Method of Line Chart Image Based on Image Processing and Pattern Recognition | |
CN111160374A (en) | A color recognition method, system and device based on machine learning | |
CN103226823A (en) | Fast image registering method based on LSPT (Logarithmic Subtraction Point Template) | |
Tian et al. | Corrosion identification of fittings based on computer vision | |
CN112750113B (en) | Glass bottle defect detection method and device based on deep learning and linear detection | |
CN115131619A (en) | Extra-high voltage part sorting method and system based on point cloud and image fusion | |
Bremananth et al. | Wood species recognition using GLCM and correlation | |
JP4909479B2 (en) | System and method for locating regions of matching color and pattern in a target image | |
CN118071831B (en) | Image coarse positioning method, device and computer readable storage medium | |
CN117576028A (en) | Image identification method based on magnetic resonance imaging multidimensional feature detection | |
Putri et al. | Color and texture features extraction on content-based image retrieval | |
Madnur et al. | Advancing in cricket analytics: novel approaches for pitch and ball detection employing OpenCV and YOLOv8 | |
CN112734767B (en) | Extraction method, device, equipment and medium based on pathological image tissue region |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||