
CN111815720B - Image processing method and device, readable storage medium and electronic equipment - Google Patents


Info

Publication number
CN111815720B
CN111815720B (application CN201910295510.9A)
Authority
CN
China
Prior art keywords
image
pixel
distance value
determining
connected region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910295510.9A
Other languages
Chinese (zh)
Other versions
CN111815720A (en)
Inventor
周兰
许译天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Horizon Robotics Technology Research and Development Co Ltd
Original Assignee
Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority to CN201910295510.9A
Publication of CN111815720A
Application granted
Publication of CN111815720B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Color Image Communication Systems (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an image processing method and device, a readable storage medium, and electronic equipment. The method comprises: acquiring an RGB image in which the photographed object is a test card; determining at least one connected region of the RGB image; determining, for each connected region, a set of distance values in that region, so as to obtain a distance value set corresponding to each connected region; and determining the chromatic aberration of the RGB image according to the distance value sets corresponding to the at least one connected region, so as to correct the RGB image according to the chromatic aberration. By determining the distance value sets of the connected regions in the RGB image and deriving the chromatic aberration from them, the chromatic aberration of the image is quantified, subjective judgment is avoided, and measurement precision is improved.

Description

Image processing method and device, readable storage medium and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a readable storage medium, and an electronic device.
Background
Chromatic aberration (CA) is an optical phenomenon that arises because a lens has different refractive indices for light of different wavelengths and therefore cannot focus light of all colors to the same convergence point. Chromatic aberration can be divided into axial (longitudinal) chromatic aberration and lateral chromatic aberration. In the prior art, chromatic aberration is usually eliminated at the lens by adjusting the lens or replacing it with a high-quality, expensive one; such methods are costly, which limits their wide application in cameras. In addition, practitioners judge the degree of chromatic aberration by inspecting image edges and then adjust the lens accordingly, so the judgment is subjective and of low accuracy, which in turn limits the adjustment of the lens and the selection of lens specifications and models.
Disclosure of Invention
In order to solve the technical problems, the application provides an image processing method, an image processing device, a readable storage medium and electronic equipment.
According to an aspect of the present application, there is provided an image processing method including: acquiring an RGB image, wherein the photographed object contained in the RGB image is a test card; determining at least one connected region of the RGB image; determining, for each connected region, a set of distance values in that region, so as to obtain a distance value set corresponding to each connected region; and determining the chromatic aberration of the RGB image according to the distance value sets corresponding to the at least one connected region, so as to correct the RGB image according to the chromatic aberration.
According to another aspect of the present application, there is provided an image processing apparatus comprising: the image acquisition module is used for acquiring RGB images, wherein the shooting objects contained in the RGB images are test cards; a first determining module for determining at least one connected region of the RGB image; the second determining module is used for determining a distance value set in each connected region according to each connected region so as to obtain a distance value set corresponding to each connected region; and a third determining module, configured to determine a color difference of the RGB image according to the distance value sets corresponding to the at least one connected region, so as to correct the RGB image according to the color difference.
According to another aspect of the present application, there is provided a computer readable storage medium storing a computer program for performing any one of the methods described above.
According to another aspect of the present application, there is provided an electronic device including: a processor; a memory for storing the processor-executable instructions; the processor is configured to perform any of the methods described above.
The embodiment of the application provides an image processing method that determines the chromatic aberration of an RGB image from the distance value sets of its connected regions, thereby quantifying the measurement of chromatic aberration, avoiding subjective judgment, and improving measurement precision.
Drawings
The above and other objects, features, and advantages of the present application will become more apparent through the detailed description of its embodiments with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application, are incorporated in and constitute a part of this specification, and serve to explain the application together with its embodiments without limiting it. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1 is a schematic view of an application scenario of the present application.
Fig. 2 is a block diagram of an exemplary test card of the present application.
Fig. 3 is a flowchart of an image processing method according to a first exemplary embodiment of the present application.
Fig. 4 is a flowchart of an image processing method according to a second exemplary embodiment of the present application.
Fig. 5 is a flowchart of an image processing method according to a third exemplary embodiment of the present application.
Fig. 6 is a schematic diagram of a distance value set in a first exemplary embodiment of the present application.
Fig. 7 is a schematic structural view of an image processing apparatus provided in the first exemplary embodiment of the present application.
Fig. 8 is a schematic structural view of an image processing apparatus provided in a second exemplary embodiment of the present application.
Fig. 9 is a schematic structural view of an image processing apparatus provided in a third exemplary embodiment of the present application.
Fig. 10 is a block diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
Hereinafter, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein.
Chromatic aberration, also known as color fringing, is an optical phenomenon that arises because a lens has different refractive indices for light of different wavelengths and therefore cannot focus light of all colors to the same convergence point.
Fig. 1 is a schematic view of an application scenario of the present application, and fig. 2 is a schematic view of a test card of the present application. As shown in fig. 2, the test card provided by the present application may be a black rectangular plate 20 provided with a plurality of equally spaced through holes 21. In the application scenario shown in fig. 1, the setup comprises a fixing support 1, a white light source 2 arranged at the bottom of the fixing support 1, the test card 20 of fig. 2 arranged in the middle of the fixing support 1, and an image acquisition device 3 (for example, a camera) arranged at the upper part of the fixing support 1. The white light source 2 is turned on, its light passes through the through holes 21 of the test card 20, and the image acquisition device 3 acquires an image of the test card 20; the chromatic aberration phenomenon can be observed in this image.
The application provides an image processing method for measuring the chromatic aberration of an image acquired by an image acquisition device, so that the image is corrected according to the chromatic aberration, and the image quality is improved. The image processing method of the present application will be described in detail with reference to the accompanying drawings, so that those skilled in the art can clearly and accurately understand the technical scheme and technical effects of the present application.
Fig. 3 is a flowchart of an image processing method according to a first exemplary embodiment of the present application. The embodiment can be applied to an electronic device; as shown in fig. 3, the method includes the following steps:
step 301, an RGB image is obtained, wherein a shooting object included in the RGB image is a test card.
The image is captured by the image acquisition device shown in fig. 1; in this step, the captured image can be obtained from that device. In the present application, the image acquired in this step may be an RGB image.
The RGB image can be obtained, for example, in one of the following ways. Scheme 1: a white light source illuminates one side of the test card, the image acquisition device photographs the test card, and the RGB image is obtained by channel separation after image processing. Scheme 2: the test card is illuminated in turn by red, green, and blue light, and an image of the test card is collected under each, yielding the RGB image. Scheme 3: the white light source is filtered through red, green, and blue optical filters, the filtered light illuminates the test card, and an image of the test card is then acquired, yielding the RGB image.
At step 302, at least one connected region of the RGB image is determined.
In this step, all connected areas are found out from the RGB image. Illustratively, all connected regions may be determined from the RGB image by a flood fill method (Floodfill) or a seed fill method.
In the present application, the number of connected regions in the RGB image corresponds to the number of test-card through holes contained in the RGB image. If the RGB image covers the whole test card, the number of connected regions equals the total number of through holes; if it covers only part of the card, the number of connected regions equals the number of through holes in that part. For example, for a test card with 150 through holes, an RGB image covering the whole card has 150 connected regions, while an image covering a partial area containing 40 through holes has 40 connected regions.
Step 303, for each connected region, determining a set of distance values in the connected region.
In the present application, the distance value set of each connected region may include the RG distance value, the GB distance value, and the distance value from the connected region to the center point of the RGB image.
Illustratively, for the R, G, and B color channels in each connected region, the distance value between the R channel and the G channel and the distance value between the G channel and the B channel are determined respectively. The distance values in the present application may be Euclidean distances.
Step 304, determining the color difference of the RGB image according to the distance value set corresponding to each of the at least one connected region, so as to correct the RGB image according to the color difference.
In this step, the chromatic aberration of the RGB image is determined based on the distance values determined in step 303. For any connected region, the magnitude of the distance values contained in its distance value set indicates the degree of chromatic aberration in the RGB image: the larger the distance values, the more pronounced the chromatic aberration.
In summary, according to the image processing method provided by the application, the chromatic aberration of the RGB image is determined from the distance value sets of its connected regions, thereby quantifying the chromatic aberration of the image, avoiding subjective judgment, and improving measurement precision.
Fig. 4 is a flowchart of an image processing method according to a second exemplary embodiment of the present application. On the basis of the embodiment shown in fig. 3, step 302 may include the following steps:
in step 3021, pixel values of all pixel points of any color channel image in an RGB image are determined.
An image is composed of pixels: small squares of the image, each assigned a numerical value representing its color. In a digitized image, this numerical value is the pixel value. For an RGB image, the pixel value of a pixel point may be, for example, its R, G, and B values.
Step 3022, converting the color channel image into a binarized image based on the pixel values of all the pixel points.
In this step, based on the pixel values obtained in step 3021, the pixel value of each pixel point is compared with a selected threshold value so as to binarize the color channel image; after binarization, the image shows a clear black-and-white appearance.
Illustratively, the conversion to a binarized image may be accomplished as follows. Step 1: compare the original pixel value of each pixel point with a preset pixel threshold. Step 2: if the original pixel value is smaller than the threshold, set the pixel value of that point to a first pixel value; if it is larger, set it to a second pixel value. Step 3: the image formed by all pixel points carrying the first pixel value together with all pixel points carrying the second pixel value is the binarized image. For example, with preset threshold a and original pixel value M, the pixel value is reset to 255 if M > a and to 0 if M < a. After all pixel points are traversed in this way, every pixel value is 0 or 255, and the reset pixel points form the binarized image.
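The binarization rule above can be sketched as follows; this is a minimal illustration, assuming plain Python lists stand in for a channel image, and the function name `binarize` is ours rather than the patent's:

```python
def binarize(channel, threshold):
    """Binarize a 2-D channel image (list of rows): pixel values above
    the threshold become 255, all others become 0."""
    return [[255 if px > threshold else 0 for px in row] for row in channel]

img = [[10, 200], [90, 130]]
print(binarize(img, 100))  # [[0, 255], [0, 255]]
```

In a real pipeline the threshold would be chosen by an adaptive or global threshold algorithm rather than hard-coded.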
In the present application, the preset pixel threshold value may be determined by an adaptive threshold algorithm or a global threshold algorithm, or may be a value selected by one skilled in the art based on his experience. And are not limited herein.
Step 3023, traversing the pixels in the binarized image to determine a connected region for each color channel image.
In this step, for the binarized image, the connected regions may be determined by, for example, the flood fill (Floodfill) method or the seed fill method. Taking the seed fill method as an example: the points around a seed point are examined, points with the same value as the seed are enqueued as new seeds, and the same expansion is applied to each newly enqueued seed, so that the region of points sharing the original seed's value is selected as one connected region. In the present application, this expansion may be a depth-first or a breadth-first search; the application is not limited in this respect.
Based on the above description of the seed fill method, the determination of a connected region can be exemplarily achieved by the following steps. Step 1: select a reference pixel point in the binarized image and traverse outward from it until none of the adjacent pixel points has the same pixel value as the reference pixel point. Step 2: determine the image area formed by the reference pixel point and all reached pixel points with the same pixel value as one connected region, thereby obtaining the connected regions of each color channel image.
For ease of understanding, suppose a pixel point A is selected and its connected region is to be found. The pixel points B around A whose pixel value equals that of A are searched for (there may be zero or more such points), and the search continues outward from each found point until no further pixel point with the same value can be reached; the image region formed by A and all the found pixel points with the same pixel value as A is the connected region.
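The seed-fill expansion described above can be sketched as a breadth-first search; a minimal illustration, assuming 4-connectivity and a list-of-lists binary image (the names are ours):

```python
from collections import deque

def connected_region(image, seed):
    """Breadth-first seed fill: starting from `seed` (row, col), collect
    all 4-connected pixel points sharing the seed's pixel value."""
    rows, cols = len(image), len(image[0])
    target = image[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and (nr, nc) not in region and image[nr][nc] == target:
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

binary = [[0, 255, 255],
          [0, 255, 0],
          [0, 0, 0]]
print(sorted(connected_region(binary, (0, 1))))  # [(0, 1), (0, 2), (1, 1)]
```

A depth-first variant would simply replace the queue with a stack, as the text notes either search order is acceptable.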
In step 3024, the connected areas of each color channel image are marked to obtain connected areas of the RGB image.
Based on the connected regions determined in step 3023, in this step these regions are marked, thereby obtaining the connected regions of the RGB image. For example, connected regions at corresponding positions in the three color channel images may be marked with the same identifier. In the present application, the region represented by each white hole (pixel value 255) in the binarized image represents one connected region.
Illustratively, for the image of any one of the R, G, and B color channels, the position of each white hole (pixel value 255) in its binarized image is marked 1, and the area outside the white holes (pixel value 0) is marked 0; the white-hole positions in the three channel images are then put in one-to-one correspondence through these marks, thereby obtaining the connected regions of the RGB image.
Through this embodiment, converting the RGB image into binarized images allows all connected regions in the image to be determined more quickly and efficiently, and marking the connected regions of each color channel image to obtain the connected regions of the RGB image improves the accuracy with which they are determined, thereby improving the overall operation speed and accuracy on which the quantification of chromatic aberration rests.
Fig. 5 is a flowchart of an image processing method according to a third exemplary embodiment of the present application. As shown in fig. 5, on the basis of the embodiment shown in fig. 3, step 303 may include the steps of:
Step 3031, determining the coordinate values of the centroid of each color channel according to the coordinates and the number of the pixel points in each color channel within the connected region.
In this step, for any connected region, the number of pixel points contained in each color channel and the coordinates (X_i, Y_i) of each pixel point in that channel can be determined. Within any one color channel of the connected region, the coordinate values of all pixel points are summed; for example, with N pixel points in total, X = X_1 + X_2 + X_3 + … + X_N and Y = Y_1 + Y_2 + Y_3 + … + Y_N. Dividing these sums by the total number of pixel points gives the coordinates of the centroid of that color channel, namely (X/N, Y/N).
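The centroid computation can be sketched directly from the sums above (a minimal illustration; the names are ours):

```python
def centroid(pixels):
    """Centroid of one color channel inside a connected region: the mean
    of the pixel coordinates, i.e. (X/N, Y/N)."""
    n = len(pixels)
    x = sum(p[0] for p in pixels)
    y = sum(p[1] for p in pixels)
    return (x / n, y / n)

print(centroid([(1, 1), (3, 1), (1, 3), (3, 3)]))  # (2.0, 2.0)
```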
Step 3032, determining a distance value of RG and a distance value of GB based on the coordinate values of the centroids of the RGB three color channels.
Based on the centroid coordinates of the three RGB color channels in any connected region obtained in the above step, the RG distance value and the GB distance value are determined; for example, let the centroid of the R channel be (x_r, y_r), that of the G channel (x_g, y_g), and that of the B channel (x_b, y_b). According to an exemplary embodiment of the present application, the RG and GB distance values may be, but are not limited to, the Euclidean distance between the R and G centroids and between the G and B centroids, respectively.
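A minimal sketch of the RG and GB distance values as Euclidean distances between channel centroids (the helper name and the sample centroids are ours):

```python
import math

def channel_offset(c1, c2):
    """Euclidean distance between two channel centroids, e.g. the RG
    distance value between the R and G centroids of one connected region."""
    return math.hypot(c1[0] - c2[0], c1[1] - c2[1])

r_centroid, g_centroid, b_centroid = (10.0, 10.0), (13.0, 14.0), (13.0, 10.0)
delta_rg = channel_offset(r_centroid, g_centroid)  # 5.0
delta_gb = channel_offset(g_centroid, b_centroid)  # 4.0
```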
Step 3033, for each connected region, determining the coordinate value of the centroid of the connected region according to the coordinates of the pixel points in the connected region and the number of the pixel points.
In an exemplary embodiment of the present application, this step may be implemented in the same way as step 3031, except that the coordinates and the number of all pixel points in the whole connected region are used. It will not be described in detail here.
In other exemplary embodiments, since the centroid coordinates of the three color channels have already been determined in step 3031 from the pixel points of the respective channels, the centroid of the connected region may be computed from them in this step. Illustratively, the centroid of the connected region is (x_R, y_R), where x_R = (x_r + x_g + x_b)/3 and y_R = (y_r + y_g + y_b)/3.
Step 3034, determining a distance value from the connected region to a center point of the RGB image based on the coordinate values of the centroid of the connected region.
Illustratively, assuming that the coordinates of the center point of the RGB image are (x_c, y_c), the distance value R from the connected region to the center point of the RGB image may be expressed as R = √((x_R − x_c)² + (y_R − y_c)²).
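Combining steps 3033 and 3034, a minimal sketch (the names and sample values are ours) of the region centroid as the mean of the three channel centroids and its distance R to the image center:

```python
import math

def region_centroid(r_c, g_c, b_c):
    """Centroid of a connected region as the mean of the three channel
    centroids: x_R = (x_r + x_g + x_b) / 3, and likewise for y_R."""
    return (sum(c[0] for c in (r_c, g_c, b_c)) / 3,
            sum(c[1] for c in (r_c, g_c, b_c)) / 3)

def distance_to_center(centroid, center):
    """Distance R from the region centroid (x_R, y_R) to the image
    center point (x_c, y_c)."""
    return math.hypot(centroid[0] - center[0], centroid[1] - center[1])

c = region_centroid((7.0, 9.0), (8.0, 10.0), (9.0, 11.0))  # (8.0, 10.0)
print(distance_to_center(c, (5.0, 6.0)))  # 5.0
```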
To facilitate an understanding of the distance value set in the present application, the following description is made with reference to fig. 6. As shown in fig. 6, the left part of the drawing is an image of the test card 20 containing a plurality of connected regions (corresponding, for example, to the through holes 21 in the test card 20). Any connected region A is selected; the three circles correspond to the R, G, and B color channels, ΔRG and ΔGB shown in the drawing are the RG and GB distance values respectively, and R is the distance value from the connected region A to the center point of the RGB image.
The present application may further include another embodiment, based on the embodiment shown in fig. 5, in which the distance value from a connected region to the center point of the RGB image is input into a trained multi-layer feedforward neural network, which outputs the RG and GB distance values corresponding to that distance value. In the application, many distance values from connected regions to the image center point, together with their corresponding RG and GB distance values, are used as training data, and the multi-layer feedforward neural network is trained repeatedly to learn the mapping between the center-point distance value and the corresponding RG and GB distance values. With the trained network, inputting the distance value from a connected region to the image center yields the corresponding RG and GB distance values, from which the chromatic aberration of the RGB image can be determined. Determining the chromatic aberration through a neural network in this way can increase the operation speed, improve the data processing capacity, and reduce the operation cost.
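As an illustration only, here is a forward pass of the kind of network described; the one-hidden-layer architecture, tanh activation, and placeholder weights are our assumptions, since the patent does not specify them, and real weights would come from training on measured (R, ΔRG, ΔGB) pairs:

```python
import math

def mlp_forward(r, weights):
    """One-hidden-layer feedforward network mapping the distance r from
    a connected region to the image center onto (delta_RG, delta_GB).
    `weights` = (w1, b1, w2, b2); tanh hidden units, linear outputs."""
    w1, b1, w2, b2 = weights
    hidden = [math.tanh(w * r + b) for w, b in zip(w1, b1)]
    return tuple(sum(wo[i] * h for i, h in enumerate(hidden)) + bo
                 for wo, bo in zip(w2, b2))

# Placeholder weights: two hidden units, identity-like output layer.
weights = ([1.0, -1.0], [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0])
print(mlp_forward(0.0, weights))  # (0.0, 0.0)
```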
The image processing method of the present application has been described in detail in the foregoing exemplary embodiments, and an image processing apparatus corresponding to the foregoing method embodiments will be further described below with reference to the accompanying drawings.
Fig. 7 is a schematic structural view of an image processing apparatus provided in the first exemplary embodiment of the present application. As shown in fig. 7, an image processing apparatus 700 may include an image acquisition module 710, a first determination module 720, a second determination module 730, and a third determination module 740.
The image obtaining module 710 may be configured to obtain an RGB image, where a photographic object included in the RGB image is a test card; the first determining module 720 may be configured to determine at least one connected region of the RGB image; the second determining module 730 may be configured to determine, for each connected region, a distance value set in the connected region, so as to obtain a distance value set corresponding to each of at least one connected region; the third determining module 740 may be configured to determine a color difference of the RGB image according to the respective distance value sets of the at least one connected region, so as to correct the RGB image according to the color difference.
Fig. 8 is a schematic structural view of an image processing apparatus provided in a second exemplary embodiment of the present application. As shown in fig. 8, the first determination module 720 may include a first determination unit 721, an image conversion unit 722, a traversal unit 723, and a marking unit 724. The first determining unit 721 may be configured to determine pixel values of all pixel points of any one color channel image in the RGB image; the image conversion unit 722 may be configured to convert the color channel image into a binarized image based on pixel values of all pixel points; the traversing unit 723 may be configured to traverse the pixel points in the binarized image to determine a connected region of each color channel image; the marking unit 724 may be configured to mark the connected areas of each color channel image, where the connected areas at the same positions in the three color channel images are marked with the same marks, so as to obtain the connected areas of the RGB images.
Fig. 9 is a schematic structural view of an image processing apparatus provided in a third exemplary embodiment of the present application. As shown in fig. 9, the second determining module 730 may include a second determining unit 731, a third determining unit 732, a fourth determining unit 733, and a fifth determining unit 734. The second determining unit 731 may be configured to determine the coordinate values of the centroid of each color channel according to the coordinates and the number of the pixel points in each color channel within the connected region; the third determining unit 732 may be configured to determine the RG distance value and the GB distance value based on the coordinate values of the centroids of the three RGB color channels; the fourth determining unit 733 may be configured to determine, for each connected region, the coordinate values of the centroid of the connected region according to the coordinates and the number of the pixel points in the connected region; and the fifth determining unit 734 may be configured to determine the distance value from the connected region to the center point of the RGB image based on the coordinate values of the centroid of the connected region.
According to an exemplary embodiment of the present application, the image processing apparatus 700 may further include an input module (not shown in the drawing) for inputting a distance value of the connected region to the center point of the RGB image into the trained multi-layer feedforward neural network to output a distance value of RG and a distance value of GB corresponding to the distance value.
The image processing device provided by the application determines the chromatic aberration of the RGB image from the distance value sets of its connected regions, thereby quantifying the measurement of chromatic aberration, avoiding subjective judgment, and improving measurement precision.
Fig. 10 illustrates a block diagram of an electronic device according to an embodiment of the application.
As shown in fig. 10, the electronic device 11 includes one or more processors 111 and a memory 112.
The processor 111 may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the electronic device 11 to perform desired functions.
Memory 112 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory (cache). The non-volatile memory may include, for example, Read-Only Memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 111 may execute the program instructions to implement the image processing methods of the various embodiments of the present application described above and/or other desired functions. Various contents such as an input signal, a signal component, and a noise component may also be stored in the computer-readable storage medium.
In one example, the electronic device 11 may further include: an input device 113 and an output device 114, which are interconnected by a bus system and/or other forms of connection mechanisms (not shown).
For example, the input device 113 may be a camera, a microphone, or a microphone array as described above, for capturing an image or an input signal of a sound source. When the electronic device is a stand-alone device, the input device 113 may be a communication network connector for receiving the acquired input signals from the neural network processor.
In addition, the input device 113 may also include, for example, a keyboard, a mouse, and the like.
The output device 114 may output various information to the outside, including the determined output voltage, output current information, and the like. The output device 114 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, etc.
Of course, for simplicity, only some of the components of the electronic device 11 relevant to the present application are shown in fig. 10; components such as buses and input/output interfaces are omitted. In addition, the electronic device 11 may include any other suitable components depending on the particular application.
In addition to the methods and apparatus described above, embodiments of the application may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps in an image processing method according to various embodiments of the application described in the "exemplary methods" section of this specification.
The computer program product may include program code for performing the operations of the embodiments of the present application, written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium, on which computer program instructions are stored, which, when being executed by a processor, cause the processor to perform the steps in an image processing method according to various embodiments of the present application described in the "exemplary method" section above in the present specification.
The computer readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber, a portable Compact Disk Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present application have been described above in connection with specific embodiments, but it should be noted that the advantages, benefits, effects, etc. mentioned in the present application are merely examples and not intended to be limiting, and these advantages, benefits, effects, etc. are not to be construed as necessarily possessed by the various embodiments of the application. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, as the application is not necessarily limited to practice with the above described specific details.
The block diagrams of the devices, apparatuses, equipment, and systems referred to in the present application are only illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by one of skill in the art, these devices, apparatuses, equipment, and systems may be connected, arranged, or configured in any manner. Words such as "including", "comprising", and "having" are open-ended words that mean "including but not limited to" and are used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or" unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to".
It is also noted that in the apparatus, devices and methods of the present application, the components or steps may be disassembled and/or assembled. Such decomposition and/or recombination should be considered as equivalent aspects of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the application to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.

Claims (9)

1. An image processing method, comprising:
acquiring an RGB image, wherein a shooting object contained in the RGB image is a test card;
Determining at least one connected region of the RGB image;
Determining a distance value set in each connected region for each connected region to obtain a distance value set corresponding to each connected region, wherein the distance value set in each connected region comprises a distance value of RG, a distance value of GB, and a distance value from the connected region to a center point of the RGB image, and the degree of chromatic aberration in the RGB image can be determined from the distance values contained in the distance value set;
Determining color differences of the RGB image according to the distance value sets respectively corresponding to the at least one connected region, so as to correct the RGB image according to the color differences;
wherein said determining a set of distance values in the connected region comprises:
Determining the coordinate value of the centroid of each color channel according to the coordinates of the pixel points and the number of the pixel points in that color channel in the connected region; determining a distance value of RG and a distance value of GB based on the coordinate values of the centroids of the three RGB color channels; for each connected region, determining the coordinate value of the centroid of the connected region according to the coordinates of the pixel points in the connected region and the number of the pixel points; and determining a distance value from the connected region to a center point of the RGB image based on the coordinate value of the centroid of the connected region.
2. The method of claim 1, wherein the determining at least one connected region of the RGB image comprises:
Determining pixel values of all pixel points of any color channel image in the RGB image;
converting the color channel image into a binarized image based on the pixel values of all the pixel points;
traversing pixel points in the binarized image to determine a connected region of each color channel image;
And marking the connected regions of each color channel image respectively, wherein the connected regions at the same corresponding positions in the three color channel images are marked with the same mark, so as to obtain the connected regions of the RGB image.
3. The method of claim 2, wherein converting the color channel image into a binarized image based on the pixel values of all the pixel points comprises:
Comparing the original pixel value of each pixel point of the color channel image with a preset pixel threshold value for each color channel image;
if the original pixel value of any pixel point is smaller than the preset pixel threshold value, setting the pixel value of the pixel point as a first pixel value;
If the original pixel value of any pixel point is larger than the preset pixel threshold value, setting the pixel value of the pixel point as a second pixel value;
and determining an image formed by all the pixel points with the pixel values of the first pixel value and all the pixel points with the pixel values of the second pixel value as a binarized image.
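A minimal sketch of the thresholding step of claim 3, assuming NumPy arrays. The threshold value 128 and the pixel values 0/255 are placeholders for the preset pixel threshold and the first/second pixel values; the claim leaves the equal-to-threshold case unspecified, and here it falls to the second value.

```python
import numpy as np

def binarize(channel, threshold=128):
    """Threshold a single color-channel image: pixels below the
    threshold get the first pixel value (0), the rest get the second
    pixel value (255)."""
    return np.where(channel < threshold, 0, 255).astype(np.uint8)
```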
4. The method of claim 2, wherein the traversing pixel points in the binarized image to determine a connected region for each of the color channel images comprises:
Selecting a reference pixel point from the binarized image and traversing all pixel points adjacent to the reference pixel point, until no pixel point among the adjacent pixel points has the same pixel value as the reference pixel point;
And determining an image area formed by the reference pixel point and all adjacent pixel points having the same pixel value as the reference pixel point as a connected region, so as to obtain the connected region of each color channel image.
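The traversal of claim 4 amounts to a flood fill from the reference pixel point: neighbors with the same pixel value keep being added until none remain. A sketch using 4-connectivity (the claim does not specify 4- versus 8-connectivity, so that choice is an assumption):

```python
from collections import deque

def connected_region(binary, seed):
    """4-connected flood fill from a reference (seed) pixel: collect
    every reachable pixel whose value equals the seed's value, stopping
    when no adjacent pixel matches."""
    h, w = binary.shape
    target = binary[seed]
    region = {seed}
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w
                    and (ny, nx) not in region
                    and binary[ny, nx] == target):
                region.add((ny, nx))
                queue.append((ny, nx))
    return region
```

Running this once per unvisited pixel, over each binarized channel image, yields the connected regions that are then matched across the three channels by position.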
5. The method of claim 1, wherein the method further comprises:
And inputting the distance value from the connected region to the central point of the RGB image into a trained multi-layer feedforward neural network to output the distance value of RG and the distance value of GB corresponding to the distance value.
6. An image processing apparatus comprising:
The image acquisition module is used for acquiring RGB images, wherein the shooting objects contained in the RGB images are test cards;
A first determining module for determining at least one connected region of the RGB image; a second determining module for determining, for each connected region, a distance value set in the connected region, so as to obtain a distance value set corresponding to each connected region, wherein the distance value set in the connected region comprises a distance value of RG, a distance value of GB, and a distance value from the connected region to a center point of the RGB image, and the degree of chromatic aberration in the RGB image can be determined from the distance values contained in the distance value set;
a third determining module, configured to determine a color difference of the RGB image according to a distance value set corresponding to each of the at least one connected region, so as to correct the RGB image according to the color difference;
The second determining module comprises a second determining unit, a third determining unit, a fourth determining unit, and a fifth determining unit, wherein the second determining unit is used for determining the coordinate value of the centroid of each color channel according to the coordinates of the pixel points and the number of the pixel points in that color channel in the connected region; the third determining unit is used for determining a distance value of RG and a distance value of GB based on the coordinate values of the centroids of the three RGB color channels; the fourth determining unit is used for determining, for each connected region, the coordinate value of the centroid of the connected region according to the coordinates of the pixel points in the connected region and the number of the pixel points; and the fifth determining unit is configured to determine a distance value from the connected region to a center point of the RGB image based on the coordinate value of the centroid of the connected region.
7. The apparatus of claim 6, wherein the first determination module comprises:
A first determining unit, configured to determine pixel values of all pixel points of any color channel image in the RGB image;
An image conversion unit for converting the color channel image into a binarized image based on the pixel values of all the pixel points;
a traversing unit, configured to traverse pixel points in the binarized image, so as to determine a connected region of each color channel image;
And a marking unit for marking the connected regions of each color channel image respectively, wherein the connected regions at the same corresponding positions in the three color channel images are marked with the same mark, so as to obtain the connected regions of the RGB image.
8. A computer-readable storage medium storing a computer program for executing the image processing method according to any one of the preceding claims 1-5.
9. An electronic device, the electronic device comprising:
A processor;
A memory for storing the processor-executable instructions;
The processor is configured to perform the image processing method according to any one of the preceding claims 1-5.
CN201910295510.9A 2019-04-12 2019-04-12 Image processing method and device, readable storage medium and electronic equipment Active CN111815720B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910295510.9A CN111815720B (en) 2019-04-12 2019-04-12 Image processing method and device, readable storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910295510.9A CN111815720B (en) 2019-04-12 2019-04-12 Image processing method and device, readable storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111815720A CN111815720A (en) 2020-10-23
CN111815720B true CN111815720B (en) 2024-07-12

Family

ID=72843996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910295510.9A Active CN111815720B (en) 2019-04-12 2019-04-12 Image processing method and device, readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111815720B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113781451B (en) * 2021-09-13 2023-10-17 长江存储科技有限责任公司 Wafer detection method, device, electronic equipment and computer readable storage medium
CN115482308B (en) * 2022-11-04 2023-04-28 平安银行股份有限公司 Image processing method, computer device, and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0219346D0 (en) * 2002-08-19 2002-09-25 Ici Plc A method for obtaining an approximate standard colour definition for a sample colour
US8339462B2 (en) * 2008-01-28 2012-12-25 DigitalOptics Corporation Europe Limited Methods and apparatuses for addressing chromatic abberations and purple fringing
CN101166285B (en) * 2006-10-16 2010-11-10 展讯通信(上海)有限公司 Automatic white balance method and device
JP5665451B2 (en) * 2010-09-21 2015-02-04 キヤノン株式会社 Image processing apparatus, magnification chromatic aberration correction method thereof, imaging apparatus, magnification chromatic aberration correction program, and recording medium
JP5818586B2 (en) * 2011-08-31 2015-11-18 キヤノン株式会社 Image processing apparatus and image processing method
CN109215090A (en) * 2017-06-30 2019-01-15 百度在线网络技术(北京)有限公司 For calculating the method, apparatus and server of color difference


Similar Documents

Publication Publication Date Title
KR102595704B1 (en) Image detection method, device, electronic device, storage medium, and program
CN109068025B (en) Lens shadow correction method and system and electronic equipment
EP2312858A1 (en) Image processing apparatus, imaging apparatus, image processing method, and program
JP7292979B2 (en) Image processing device and image processing method
JP2007503030A5 (en)
CN105654469B (en) A kind of automatic analysis method and system of baby stool color
US11967040B2 (en) Information processing apparatus, control method thereof, imaging device, and storage medium
KR20090090333A (en) Focus tidal system and method
CN111815720B (en) Image processing method and device, readable storage medium and electronic equipment
US12188823B2 (en) Method for determining a tooth colour
US10096128B2 (en) Change degree deriving device, change degree deriving system, and non-transitory computer readable medium
US8279308B1 (en) Optimized log encoding of image data
CN114866754A (en) Automatic white balance method and device, computer readable storage medium and electronic equipment
JP2006031440A (en) Image processing method, image processing apparatus, image processing program and image processing system
CN115587948A (en) Image dark field correction method and device
US20170085753A1 (en) Image data generating apparatus, printer, image data generating method, and non-transitory computer readable medium
CN113315995B (en) Method and device for improving video quality, readable storage medium and electronic equipment
CN108198226B (en) Ceramic color identification method, electronic equipment, storage medium and device
TW202024994A (en) Image positioning system based on upsampling and method thereof
CN102667853B (en) Optical filter for binary sensor arranges study
CN112580578A (en) Binocular living camera face ranging method and system
CN114359764A (en) Method, system and related equipment for identifying illegal building based on image data
CN117156289A (en) Color style correction method, system, electronic device, storage medium and chip
CN112950509B (en) Image processing method and device and electronic equipment
JP7291389B2 (en) Object identification method, information processing device, information processing program, and lighting device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant