CN113660428A - Image acquisition method and device and electronic equipment - Google Patents
- Publication number
- CN113660428A (application number CN202010366166.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- frames
- test images
- determining
- processed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/745—Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
The present disclosure relates to an image acquisition method, an image acquisition device, and an electronic device, and belongs to the technical field of image acquisition. The image acquisition method can effectively eliminate the reflection area in an image, improving image quality and optimizing the user experience. The method is applied to an electronic device and comprises the following steps: acquiring the flicker frequency of ambient light in the shooting environment; acquiring an image to be processed and at least two frames of test images captured at a set frame rate, wherein the set frame rate is greater than the flicker frequency; determining a light reflection area in the image to be processed according to the brightness of the at least two frames of test images; and performing reflection elimination processing on the reflection area of the image to be processed to generate a target image.
Description
Technical Field
The present disclosure relates to the field of image acquisition technologies, and in particular, to an image acquisition method and apparatus, and an electronic device.
Background
With the development of electronic devices, image capture has become an essential function of devices such as mobile phones. In the related art, images acquired in certain scenes contain light reflection areas. For example, when a user shoots an outdoor scene through glass from indoors, the acquired image may show the reflection of an indoor light source on the glass. The light reflection region blurs or even destroys image details, reducing image quality and degrading the user experience.
Disclosure of Invention
The present disclosure provides an image acquisition method, an image acquisition device, and an electronic device, which address the defect in the related art that an acquired image may contain a light reflection region, such as the reflection of an indoor light source on glass.
In a first aspect, an embodiment of the present disclosure provides an image obtaining method, where the method is applied to an electronic device, and the electronic device further includes: the system comprises a strobe frequency acquisition component and a camera shooting component; the method comprises the following steps:
acquiring the flicker frequency of ambient light in a shooting environment;
acquiring an image to be processed and at least two frames of test images acquired at a set frame rate, wherein the set frame rate is greater than the flicker frequency;
determining a light reflection area in the image to be processed according to the brightness of the at least two frames of test images;
and carrying out reflection elimination processing on the reflection area of the image to be processed to generate a target image.
In one embodiment, the determining a light reflection region within the image to be processed according to the brightness of the at least two frames of test images includes:
determining the pixel coordinates of the light reflecting area according to the brightness difference of the at least two frames of test images;
and determining the position of the light reflection area in the image to be processed according to the pixel coordinates.
In one embodiment, the determining pixel coordinates of a light reflection area according to the brightness difference of the at least two frames of test images comprises:
determining at least one group of inter-frame brightness change areas according to the brightness difference of any two frames of the test images;
determining a group of the inter-frame luminance change regions where the luminance difference is largest as the light reflection region;
the pixel coordinates of the light reflecting area are determined.
In one embodiment, determining at least one group of inter-frame brightness change regions according to the brightness difference between any two frames of the test image comprises:
and determining the area with the brightness difference absolute value larger than or equal to a set threshold value in any two frames of the test images as a group of the inter-frame brightness change areas.
In one embodiment, the acquiring the to-be-processed image and the at least two test images acquired at the set frame rate includes:
acquiring at least two frames of test images acquired at the set frame rate; synthesizing the image to be processed according to the obtained at least two frames of the test images;
or,
acquiring the at least two frames of test images acquired at the set frame rate and a first exposure time; and acquiring the image to be processed acquired at the set frame rate and a second exposure time, wherein the second exposure time is longer than the first exposure time.
In one embodiment, the set frame rate is greater than or equal to twice the flicker frequency.
In a second aspect, an embodiment of the present disclosure provides an image capturing apparatus, where the image capturing apparatus is applied to an electronic device, and the electronic device further includes: the system comprises a strobe frequency acquisition component and a camera shooting component; the image acquisition apparatus includes:
the first acquisition module is used for acquiring the flicker frequency of the ambient light through the flicker frequency acquisition component;
the second acquisition module is used for acquiring an image to be processed and at least two frames of test images at a set frame rate through the camera shooting assembly, wherein the set frame rate is greater than the flicker frequency;
the determining module is used for determining a light reflecting area in the image to be processed according to the brightness of the at least two frames of test images; and
and the reflection eliminating module is used for carrying out reflection eliminating processing on the reflection area of the image to be processed to generate a target image.
In one embodiment, the determining module comprises:
a first determining unit, configured to determine pixel coordinates of the light reflection area according to a luminance difference between the at least two frames of test images; and
a second determining unit, configured to determine the light reflection area in the image to be processed according to the pixel coordinates.
In one embodiment, the first determination unit includes:
the first determining subunit is used for determining at least one group of inter-frame brightness change areas according to the brightness difference of any two frames of the test images;
a second determining subunit that determines, as the light reflection area, a group of the inter-frame luminance change areas where the luminance difference is largest; and
and the acquisition subunit is used for acquiring the pixel coordinates of the light reflection area.
In one embodiment, the first determining subunit is specifically configured to: and taking the area with the luminance difference absolute value larger than or equal to a set threshold value in any two frames of the test images as a group of the inter-frame luminance change areas.
In one embodiment, the second obtaining module comprises:
the first acquisition unit is used for acquiring at least two frames of test images acquired at the set frame rate; and
the synthesizing unit is used for synthesizing the image to be processed according to the acquired at least two frames of the test images;
or, the second obtaining module includes:
the second acquisition unit is used for acquiring the at least two frames of test images acquired at the set frame rate and the first exposure time; and
and the third acquisition unit is used for acquiring the image to be processed acquired at the set frame rate and a second exposure time, wherein the second exposure time is longer than the first exposure time.
In one embodiment, the set frame rate is greater than or equal to twice the flicker frequency.
In a third aspect, an embodiment of the present disclosure provides an electronic device, where the device includes:
a strobe acquisition component;
a camera assembly;
a memory storing executable instructions; and
a processor configured to execute the executable instructions in the memory to implement the steps of the image acquisition method provided by the first aspect described above.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable storage medium having executable instructions stored thereon which, when executed by a processor, implement the steps of the image acquisition method provided by the first aspect.
The image acquisition method, the image acquisition device and the electronic equipment provided by the disclosure have at least the following beneficial effects:
the image obtaining method provided by the embodiment of the disclosure obtains the test images under the conditions of different environmental light brightness according to the environment light flicker frequency, and then positions the light reflection area based on the brightness change of at least two frames of test images. The method can quickly and accurately position the light reflecting area, and further carry out light reflection elimination processing on the light reflecting area in the image to be processed to obtain the image without the light reflecting area. In this way, the transparent glass for the user can also shoot a clear outdoor scene image, so that the image quality is improved, and the user experience is optimized.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow diagram illustrating an image acquisition method according to an exemplary embodiment;
FIG. 2 is a flow chart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 3 is a flow chart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 4 is a flow chart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 5 is a flow chart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 6 is a flow chart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 7 is a block diagram of an image acquisition device shown in accordance with an exemplary embodiment;
FIG. 8 is a block diagram of an image acquisition device according to another exemplary embodiment;
FIG. 9 is a block diagram of an image acquisition device according to another exemplary embodiment;
FIG. 10 is a block diagram of an image acquisition device according to another exemplary embodiment;
FIG. 11 is a block diagram of an image acquisition device according to another exemplary embodiment;
FIG. 12 is a block diagram of an electronic device shown in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The use of the terms "a" or "an" and the like in the description and in the claims of this disclosure do not denote a limitation of quantity, but rather denote the presence of at least one. Unless otherwise indicated, the word "comprise" or "comprises", and the like, means that the element or item listed before "comprises" or "comprising" covers the element or item listed after "comprises" or "comprising" and its equivalents, and does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect.
As used in the specification and claims of this disclosure, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
Based on the defect that an image acquired in the related art has a light reflection region, the embodiment of the disclosure provides an image acquisition method, an image acquisition device and electronic equipment.
Before explaining an image acquisition method provided by an embodiment of the present disclosure, an electronic device to which the image acquisition method is applied is first described.
In one embodiment, the electronic device includes a flicker frequency acquisition component and a camera component. The flicker frequency acquisition component collects ambient light and determines its flicker frequency; optionally, the component is a flicker sensor. The camera component comprises a lens and an image sensor: incident light is projected through the lens onto the image sensor, which senses the received light and outputs an image. In the embodiments of the present disclosure, the camera component is a high-frequency camera capable of capturing at more than 120 fps.
In one embodiment, the strobe frequency acquisition component and the camera component are external devices matched with the electronic device. Optionally, the strobe frequency acquisition assembly and the camera assembly are connected with the electronic device through wireless connection modes such as bluetooth and WiFi. Or the flash frequency acquisition component and the camera shooting component are connected with the electronic equipment in a wired connection mode such as a USB data line.
FIG. 1 is a schematic flow diagram illustrating an image acquisition method according to an exemplary embodiment. As shown in fig. 1, in combination with the electronic device, an image acquisition method provided by the embodiment of the present disclosure includes:
Step 101, acquiring the flicker frequency of ambient light in the shooting environment.

Indoor light sources (such as ceiling lamps and table lamps) are powered by alternating current at 50 Hz or 60 Hz, so their brightness changes periodically, producing rapid flicker.
The flicker frequency of the ambient light is acquired through the flicker frequency acquisition component. When the component is an external device of the electronic device, step 101 specifically includes: receiving the ambient light flicker frequency data sent by the flicker frequency acquisition component. The component may acquire the flicker frequency in either of the following two ways.
In a first alternative, the flicker frequency acquisition component acquires the flicker frequency of the ambient light in response to the shooting trigger operation, which also triggers the camera component to operate. Accordingly, step 101 is completed during image capture.

In this way, the flicker frequency acquired by the component is exactly that of the shooting environment, which improves the accuracy of the subsequent determination of the light reflection area.
In a second alternative, the flicker frequency acquisition component continuously acquires the flicker frequency of the ambient light at a set period, and the flicker frequency acquired last before the shooting trigger operation is taken as the ambient light flicker frequency of the shooting environment.

Because the ambient light parameters of the shooting environment are relatively stable, the flicker frequency acquired in this way can also represent that of the shooting environment, likewise ensuring the accuracy of the subsequent determination of the light reflection area.
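The second alternative can be sketched as follows. This is an illustrative sketch, not code from the patent; the class and method names are invented for the example:

```python
class FlickerFrequencyMonitor:
    """Records periodic flicker-frequency readings; the most recent reading
    before the shutter is triggered stands in for the shooting environment
    (hypothetical sketch of the patent's second alternative)."""

    def __init__(self):
        self._readings = []  # list of (timestamp_s, frequency_hz)

    def record(self, timestamp_s, frequency_hz):
        # Called by the flicker sensor once per set period.
        self._readings.append((timestamp_s, frequency_hz))

    def frequency_at_trigger(self, trigger_time_s):
        # Return the last reading acquired at or before the shooting trigger.
        candidates = [f for t, f in self._readings if t <= trigger_time_s]
        if not candidates:
            raise ValueError("no flicker reading before trigger")
        return candidates[-1]
```

A trigger at t = 2.5 s would pick up the reading recorded at t = 1.0 s if the next one only arrives at t = 3.0 s.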
Step 102, acquiring an image to be processed and at least two frames of test images acquired at a set frame rate, wherein the set frame rate is greater than the flicker frequency.
In the embodiment of the disclosure, the test image is used for positioning the light reflection area, and the image to be processed is used for subsequent processing and finally presented on the display screen of the electronic equipment.
Optionally, the image to be processed and the at least two frames of test images are acquired by the camera assembly. For example, the shooting triggering operation triggers the camera shooting assembly to collect. When the camera module is an external device of the electronic device, step 102 specifically includes: and receiving data of the image to be processed and at least two frames of test images sent by the camera shooting assembly.
Because the set frame rate is greater than the flicker frequency, the camera component can capture test images under different ambient light intensities. Preferably, the set frame rate is greater than or equal to 2 times the flicker frequency. In that case, the camera component can capture test images at both the brightest and the darkest phases of the ambient light, so the brightness difference between test frames caused by ambient light flicker is largest and the light reflection area can be located accurately from the test images.
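The frame-rate constraint can be expressed as a small helper. This is an illustrative sketch assuming the flicker frequency is already known; the function name and the `multiplier` parameter are invented:

```python
def choose_frame_rate(flicker_hz, multiplier=2.0):
    """Return a capture frame rate at least `multiplier` times the ambient
    flicker frequency, so that successive test frames sample both the
    brightest and the darkest phases of the light's cycle."""
    if flicker_hz <= 0:
        raise ValueError("flicker frequency must be positive")
    return multiplier * flicker_hz
```

For a measured 100 Hz brightness flicker, `choose_frame_rate(100.0)` yields a 200 fps capture rate.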
Fig. 2, 3 are implementations of step 102 shown according to various exemplary embodiments.
As an example, as shown in fig. 2, step 102 specifically includes:

Step 1021a, acquiring the at least two frames of test images acquired at the set frame rate and a first exposure time.

Step 1022a, acquiring the image to be processed acquired at the set frame rate and a second exposure time, wherein the second exposure time is longer than the first exposure time.

Because its exposure time is longer, the image to be processed has more image detail than the test images, which guarantees the quality of the image finally presented on the display screen of the electronic device. In step 1021a and step 1022a, image capture is performed by the camera component.
As another example, as shown in fig. 3, step 102 specifically includes:
Step 1021b, acquiring at least two frames of test images acquired at the set frame rate.

Step 1022b, synthesizing the image to be processed from the acquired at least two frames of test images.

This approach likewise guarantees rich image detail in the image to be processed and optimizes image quality. In step 1021b, image capture is performed by the camera component.
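The patent does not fix the synthesis method of step 1022b; simple per-pixel averaging of the short-exposure test frames is one plausible choice, sketched below (the function name is invented):

```python
import numpy as np

def synthesize_image(test_frames):
    """Synthesize the image to be processed from several short-exposure
    test frames by per-pixel averaging — one illustrative fusion method,
    not necessarily the one used in the patent."""
    stack = np.stack([f.astype(np.float64) for f in test_frames])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)
```

Averaging several frames also suppresses sensor noise, which helps preserve detail in the synthesized image.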
With continued reference to fig. 1, step 103 is performed after step 102, as follows:
and 103, determining a light reflection area in the image to be processed according to the brightness of at least two frames of test images.
Due to the brightness change of the indoor light source, the brightness of the image content produced by indoor light differs between test frames; that is, the brightness of the light reflection area differs across the test images. By contrast, the brightness of the image outside the light reflection region is only slightly influenced by ambient light flicker, so the brightness change region can be extracted by subtracting the brightness of two test frames.
FIG. 4 is a flowchart illustrating step 103 according to an exemplary embodiment. In one embodiment, as shown in fig. 4, step 103 specifically includes:
and step 1031, determining the pixel coordinates of the light reflection region according to the brightness difference of the at least two frames of test images.
Different frames among the at least two test images correspond to different ambient light brightness, so the light reflection area can be located in the test images according to their brightness change. The pixel coordinates of the light reflection area refer to the pixel coordinates of its boundary points. The embodiments of the present disclosure do not limit the number of boundary points; of course, the more boundary points, the more accurately the light reflection area is determined.
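The patent does not prescribe how boundary points are extracted; one simple sketch, assuming the reflection area is given as a boolean mask, marks every masked pixel that touches an unmasked 4-neighbour:

```python
import numpy as np

def boundary_coordinates(mask):
    """Pixel coordinates of the reflection region's boundary: pixels inside
    the boolean mask that have at least one 4-neighbour outside it.
    Hypothetical helper, not from the patent."""
    padded = np.pad(mask, 1, constant_values=False)
    # A pixel is interior when all four neighbours are also in the mask.
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return np.argwhere(mask & ~interior)  # (row, col) pairs
```

For a 3x3 filled square, this returns the 8 ring pixels and omits the centre.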
Step 1032, determining the position of the light reflection area in the image to be processed according to the pixel coordinates determined in the test images.
The electronic device is relatively stable during shooting, so in the embodiments of the present disclosure the test images and the image to be processed are considered to present the same shooting area. In other words, they share the same coordinate system and cover the same scene. In this case, the pixel coordinates of the light reflection area in the test images are the same as those of the light reflection area in the image to be processed.
By adopting the mode, the light reflection region is accurately positioned in the image to be processed based on the brightness change in the environment light flickering process, the subsequent light reflection elimination processing difficulty is reduced, and the processing effect and the processing speed of the light reflection elimination processing are optimized.
In one embodiment, fig. 5 is a flowchart illustrating step 1031, according to an exemplary embodiment. As shown in fig. 5, step 1031 includes:
Step 1031a, determining at least one group of inter-frame brightness change areas according to the brightness difference between any two frames of the test images.

Optionally, a region in which the absolute value of the brightness difference between corresponding pixels of any two test frames is greater than or equal to a set threshold is taken as a group of inter-frame brightness change areas. Such a group may include more than one brightness change sub-region. Take the scene of shooting an outdoor image through glass from indoors as an example: the indoor light source forms multiple reflection areas on the glass, so multiple reflection areas appear in the test images acquired by the electronic device, and their brightness differs between test frames under the influence of ambient light flicker. The inter-frame brightness change region obtained from the brightness difference of two test frames therefore comprises multiple brightness change sub-regions.
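The thresholding just described can be sketched as follows; `brightness_change_mask` is an invented name, and the frames are assumed to be single-channel luminance images:

```python
import numpy as np

def brightness_change_mask(frame_a, frame_b, threshold):
    """Mark pixels whose absolute luminance difference between two test
    frames is at least the set threshold. Connected True pixels form the
    inter-frame brightness-change (candidate reflection) sub-regions."""
    # Widen to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return diff >= threshold
```

The resulting boolean mask is the "group of inter-frame brightness change areas" for that frame pair.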
It should be noted that when only two test frames are acquired, a single group of inter-frame brightness change areas is determined from their brightness difference. When three or more test frames are acquired, every pair of test frames is compared, yielding at least three groups of inter-frame brightness change areas.
Step 1031b, determining the group of inter-frame brightness change areas with the largest brightness difference as the light reflection area.
In the case where only one group of inter-frame brightness change areas is acquired in step 1031a, that group is determined to be the light reflection area. In the case where multiple groups are acquired, the group with the largest brightness difference is taken as the light reflection area: the largest brightness difference indicates that the ambient light fluctuated most between the acquisition of those two test frames, and this obvious change in ambient light brightness allows the reflective area to be located accurately.
Further, in the case where multiple groups of inter-frame brightness change areas are acquired in step 1031a and a group includes multiple brightness change sub-regions, the light reflection area is determined from the brightness difference of a designated region within each group. For example, the designated region is the largest brightness change sub-region in the group (i.e., the sub-region covering the most pixels).
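Steps 1031a and 1031b together can be sketched as a pairwise comparison. Here each group is scored by the mean absolute difference over its changed pixels — one plausible reading of "largest brightness difference"; the patent also allows scoring on a designated sub-region only:

```python
import numpy as np
from itertools import combinations

def pick_reflection_mask(frames, threshold):
    """Compare every pair of test frames and keep the inter-frame change
    region whose brightness difference is largest. Invented helper
    illustrating steps 1031a/1031b, not the patent's exact procedure."""
    best_mask, best_score = None, -1.0
    for a, b in combinations(frames, 2):
        diff = np.abs(a.astype(np.int16) - b.astype(np.int16))
        mask = diff >= threshold  # this pair's change region (step 1031a)
        if mask.any():
            score = float(diff[mask].mean())  # pair's brightness difference
            if score > best_score:            # keep the largest (step 1031b)
                best_mask, best_score = mask, score
    return best_mask
```

With three frames the three pairs are scored and the pair spanning the brightest and darkest flicker phases wins.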
Step 1031c, obtaining the pixel coordinates of the light reflection area.
With continuing reference to fig. 1, after the light reflection area is determined in the image to be processed through step 103, the image acquisition method further includes step 104, specifically:
and 104, performing reflection elimination processing on the reflection area of the image to be processed to generate a target image.
The target image is the image to be processed after reflection elimination. During reflection elimination, the image within the reflection area is restored from the surrounding image content, recovering the image details lost in that area.
Optionally, an image inpainting algorithm (inpainting algorithm) is used to perform reflection elimination processing on the reflection region in the image to be processed.
Illustratively, a partial differential equation (PDE) method is used to repair the reflection region in the image to be processed. During processing, according to the light reflection area determined in step 103, information from outside the area is diffused toward the pixels to be repaired along the normal direction of the area's contour.
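A toy heat-equation diffusion gives the flavor of PDE-based repair. This is far simpler than the patent's method (which diffuses along the contour normal) or production inpainting algorithms, and is included only as an illustrative sketch:

```python
import numpy as np

def diffuse_inpaint(image, mask, iterations=300):
    """Toy PDE repair: repeatedly replace each masked pixel with the mean
    of its 4-neighbours, diffusing information from outside the reflection
    region inward until the hole is smoothly filled."""
    img = image.astype(np.float64).copy()
    for _ in range(iterations):
        avg = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
        img[mask] = avg[mask]  # only masked pixels are updated
    return img
```

Pixels outside the mask are never modified, so the repair blends the hole into its untouched surroundings.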
Illustratively, the reflective area is repaired using exemplar-based inpainting. During processing, a patch is defined at the border of the reflective region so that it includes part of the region's boundary. The image outside the reflective region is divided into candidate portions of the patch's size, the portion most similar to the patch is chosen as the reference portion, and the reflective content within the patch is restored from that reference portion.
Optionally, a deep learning algorithm is used to construct a reflection elimination model, and the reflection area determined in step 103 is repaired by the reflection elimination model.
During reflection elimination, the test frame whose reflection area is darkest or smallest is used as the reference image for the reflection elimination processing. A darker or smaller reflection area indicates that the test frame was less affected by ambient light, and a more accurate reference image improves the processing rate and the accuracy with which the original content of the reflection area is restored.
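Reference-frame selection by darkest reflection area can be sketched as follows (an invented helper returning the index of the chosen frame; the patent equally allows choosing by smallest reflection area):

```python
import numpy as np

def darkest_reflection_frame(test_frames, reflection_mask):
    """Return the index of the test frame whose reflection region has the
    lowest mean brightness — i.e. the frame least affected by ambient
    light, used as the reference for reflection elimination."""
    means = [float(f[reflection_mask].mean()) for f in test_frames]
    return int(np.argmin(means))
```

The selected frame's reflection region then serves as a hint for what the repaired pixels should look like.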
The image acquisition method provided by the embodiment of the disclosure is explained below with reference to specific application examples. FIG. 6 is a flowchart illustrating an image acquisition method according to another exemplary embodiment. As shown in fig. 6, the acquiring method includes:
and step 610, acquiring the ambient light flicker frequency through the flicker frequency acquisition component.
Step 602, in response to the shooting trigger operation, acquiring an image to be processed and at least two frames of test images by the camera assembly at a frame rate 2 times of the flicker frequency.
Step 603, determining a light reflection area in the image to be processed according to the brightness of the at least two frames of test images. Wherein, the area with alternating brightness and darkness in at least two frames of test images is the light reflecting area.
And step 604, performing reflection elimination processing on the reflection area in the image to be processed to generate a target image. Wherein, an inpainting algorithm or a deep learning algorithm can be adopted to eliminate the reflective area in the image to be processed.
The image acquisition method provided by the embodiments of the present disclosure acquires test images under different ambient light brightness conditions according to the ambient light flicker frequency, and locates the light reflection area based on the brightness change across at least two frames of test images. The method can quickly and accurately locate the light reflection area, and then perform reflection elimination processing on that area in the image to be processed to obtain an image free of reflections. In this way, even when shooting through transparent glass, the user can capture a clear image of the outdoor scene, improving image quality and optimizing the user experience.
The embodiment of the disclosure also provides an image acquisition device. The image acquisition device is applied to the electronic equipment provided by the above, and the electronic equipment comprises a flash frequency acquisition component and a camera component. FIG. 7 is a block diagram illustrating an image acquisition device according to an exemplary embodiment. As shown in fig. 7, the image pickup apparatus includes: a first acquiring module 701, a second acquiring module 702, a determining module 703, and a reflection eliminating module 704.
The first obtaining module 701 is configured to obtain a flicker frequency of ambient light of a shooting environment;
the second obtaining module 702 is configured to obtain an image to be processed and at least two frames of test images collected at a set frame rate, where the set frame rate is greater than the flicker frequency;
the determining module 703 is configured to determine a light reflection area in the image to be processed according to the brightness of at least two frames of test images; and
the reflection eliminating module 704 is configured to perform reflection eliminating processing on a reflection area of the image to be processed, so as to generate a target image.
In one embodiment, the set frame rate is greater than or equal to twice the flicker frequency.
In one embodiment, fig. 8 is a block diagram illustrating a determination module according to an exemplary embodiment, and as shown in fig. 8, the determination module 703 includes: a first determining unit 7031 and a second determining unit 7032.
The first determining unit 7031 is configured to determine the pixel coordinates of the light reflection area according to the luminance difference between at least two frames of the test images.
The second determining unit 7032 is configured to determine the light reflection area in the image to be processed according to the pixel coordinates.
Fig. 9 is a block diagram illustrating a first determination unit according to an example embodiment. As shown in fig. 9, first determining unit 7031 includes: a first determining subunit 7031a, a second determining subunit 7031b, and a third determining subunit 7031 c.
The first determining subunit 7031a is configured to determine at least one group of inter-frame luminance change areas according to the luminance difference between any two frames of test images.
The second determining subunit 7031b is configured to determine, as the light reflection area, a group of inter-frame luminance change areas where the luminance difference is the largest.
The third determining subunit 7031c is configured to obtain the pixel coordinates of the light reflection area.
In an embodiment, the first determining subunit 7031a is specifically configured to take a region in any two frames of the test images where the absolute value of the luminance difference is greater than or equal to a set threshold as a group of inter-frame luminance change regions.
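The pairwise comparison performed by the first determining subunit can be sketched as follows (a hypothetical sketch; the function name, the threshold value, and the use of `np.argwhere` to report pixel coordinates are assumptions, not part of the disclosure):

```python
import numpy as np
from itertools import combinations

def reflection_pixel_coords(test_frames, threshold=30):
    """For every pair of test frames, threshold the absolute luminance
    difference into an inter-frame change region, keep the group whose
    luminance difference is largest, and return its pixel coordinates."""
    best_coords, best_diff = None, -1
    for a, b in combinations(test_frames, 2):
        diff = np.abs(a.astype(np.int32) - b.astype(np.int32))
        region = diff >= threshold  # one group of inter-frame change areas
        if region.any() and diff[region].max() > best_diff:
            best_diff = diff[region].max()
            best_coords = np.argwhere(region)  # (row, col) pairs
    return best_coords
```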
In one embodiment, FIG. 10 is a block diagram illustrating a second acquisition module, according to an example embodiment. As shown in fig. 10, the second obtaining module 702 includes: a first obtaining unit 7021a and a synthesizing unit 7022 a.
The first obtaining unit 7021a is configured to obtain at least two frames of test images acquired at a set frame rate.
The synthesizing unit 7022a is configured to synthesize the image to be processed from the acquired at least two frames of test images.
In one embodiment, FIG. 11 is a block diagram illustrating a second acquisition module in accordance with another example embodiment. As shown in fig. 11, second obtaining module 702 includes a second obtaining unit 7021b and a third obtaining unit 7022 b.
The second obtaining unit 7021b is configured to obtain at least two frames of test images acquired at the set frame rate and the first exposure time.
The third acquiring unit 7022b is configured to acquire the image to be processed acquired at the set frame rate and the second exposure time. Wherein the second exposure time is longer than the first exposure time.
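The relationship among the flicker frequency, the set frame rate, and the two exposure times can be illustrated with a small configuration helper (the class name, the choice of exactly twice the flicker frequency, and the 0.2 exposure ratio are illustrative assumptions, not values fixed by the disclosure):

```python
from dataclasses import dataclass

@dataclass
class CaptureConfig:
    flicker_hz: float  # measured ambient-light flicker frequency

    def frame_rate(self):
        # Set frame rate >= 2x the flicker frequency so that consecutive
        # test frames land on opposite (bright/dark) phases of the light.
        return 2 * self.flicker_hz

    def frame_period_ms(self):
        return 1000.0 / self.frame_rate()

    def exposures_ms(self, ratio=0.2):
        # First (short) exposure for the test frames, second (longer)
        # exposure for the image to be processed; both fit within one
        # frame period. The 0.2 ratio is an illustrative assumption.
        period = self.frame_period_ms()
        return ratio * period, period
```

For example, with a 100 Hz flicker (50 Hz mains lighting), this gives a 200 fps set frame rate, a 5 ms frame period, and a second exposure time longer than the first, as the embodiment requires.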
The image acquisition apparatus provided by the embodiment of the disclosure acquires test images under different ambient light brightness conditions according to the ambient light flicker frequency, and locates the light reflection area based on the brightness variation across the at least two frames of test images. The apparatus can quickly and accurately locate the light reflection area, and then perform reflection elimination processing on that area in the image to be processed to obtain an image free of reflections. In this way, the user can capture a clear outdoor scene image even when shooting through transparent glass, which improves image quality and optimizes the user experience.
The embodiment of the disclosure also provides an electronic device, which includes a memory and a processor in addition to the flicker frequency acquisition component and the camera component. The memory stores executable instructions, and the processor is configured to execute the executable instructions in the memory to implement the steps of the image acquisition method provided above.
Fig. 12 is a block diagram of a terminal device provided in accordance with an example embodiment. As shown in fig. 12, terminal device 1200 may include one or more of the following components: processing component 1202, memory 1204, power component 1206, multimedia component 1208, audio component 1210, input/output (I/O) interface 1212, sensor component 1214, communications component 1216, and image capture component.
The memory 1204 is configured to store various types of data to support operation at the terminal device 1200. Examples of such data include instructions for any application or method operating on terminal device 1200, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1204 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The multimedia component 1208 includes a screen providing an output interface between the terminal device 1200 and the target object. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a target object. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc.
The sensor assembly 1214 includes one or more sensors for providing various aspects of state assessment for the terminal device 1200. For example, the sensor assembly 1214 may detect the open/closed state of the terminal device 1200 and the relative positioning of components, such as the display and keypad of the terminal device 1200. The sensor assembly 1214 may also detect a change in position of the terminal device 1200 or one of its components, the presence or absence of a target object in contact with the terminal device 1200, the orientation or acceleration/deceleration of the terminal device 1200, and a change in the temperature of the terminal device 1200. As another example, the sensor assembly 1214 further includes a light sensor disposed below the OLED display screen.
In an exemplary embodiment, the terminal device 1200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components.
In an exemplary embodiment, the disclosed embodiment also provides a readable storage medium storing executable instructions. The executable instructions can be executed by a processor of the terminal device to implement the steps of the image acquisition method provided above. The readable storage medium may be, among others, ROM, Random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
Claims (14)
1. An image acquisition method, applied to an electronic device, the method comprising:
acquiring the flicker frequency of ambient light in a shooting environment;
acquiring an image to be processed and at least two frames of test images acquired at a set frame rate, wherein the set frame rate is greater than the flicker frequency;
determining a light reflection area in the image to be processed according to the brightness of the at least two frames of test images;
and carrying out reflection elimination processing on the reflection area of the image to be processed to generate a target image.
2. The method according to claim 1, wherein determining a light reflection region within the image to be processed according to the luminance of the at least two frames of test images comprises:
determining the pixel coordinates of the light reflecting area according to the brightness difference of the at least two frames of test images;
and determining the position of the light reflection area in the image to be processed according to the pixel coordinates.
3. The method of claim 2, wherein determining pixel coordinates of a light reflection region from a luminance difference of the at least two frames of test images comprises:
determining at least one group of inter-frame brightness change areas according to the brightness difference of any two frames of the test images;
determining a group of the inter-frame luminance change regions where the luminance difference is largest as the light reflection region;
the pixel coordinates of the light reflecting area are determined.
4. The method of claim 3, wherein determining at least one group of inter-frame luminance change regions according to the luminance difference between any two frames of the test image comprises:
and determining the area with the brightness difference absolute value larger than or equal to a set threshold value in any two frames of the test images as a group of inter-frame brightness change areas.
5. The method according to claim 1, wherein the acquiring the to-be-processed image and the at least two test images acquired at the set frame rate comprises:
acquiring at least two frames of test images acquired at the set frame rate; synthesizing the image to be processed according to the obtained at least two frames of the test images;
or,
acquiring the at least two frames of test images acquired at the set frame rate and the first exposure time; and acquiring the image to be processed acquired at the set frame rate and a second exposure time, wherein the second exposure time is longer than the first exposure time.
6. The method of claim 1, wherein the set frame rate is greater than or equal to twice the flicker frequency.
7. An image acquisition apparatus, wherein the image acquisition apparatus is applied to an electronic device, and the electronic device further includes: the system comprises a strobe frequency acquisition component and a camera shooting component; the image acquisition apparatus includes:
the first acquisition module is used for acquiring the flicker frequency of the ambient light;
the second acquisition module is used for acquiring an image to be processed and at least two frames of test images which are acquired at a set frame rate, wherein the set frame rate is greater than the flicker frequency;
the determining module is used for determining a light reflecting area in the image to be processed according to the brightness of the at least two frames of test images; and
and the reflection eliminating module is used for carrying out reflection eliminating processing on the reflection area of the image to be processed to generate a target image.
8. The apparatus of claim 7, wherein the determining module comprises:
a first determining unit, configured to determine pixel coordinates of the light reflection area according to a luminance difference between the at least two frames of test images; and
a second determining unit, configured to determine the light reflection area in the image to be processed according to the pixel coordinates.
9. The apparatus according to claim 8, wherein the first determining unit comprises:
the first determining subunit is used for determining at least one group of inter-frame brightness change areas according to the brightness difference of any two frames of the test images;
a second determining subunit, configured to determine, as the light reflection area, a group of the inter-frame luminance change areas where the luminance difference is largest; and
a third determining subunit, configured to determine pixel coordinates of the light reflection area.
10. The apparatus according to claim 9, wherein the first determining subunit is specifically configured to take a region in any two frames of the test images where the absolute value of the luminance difference is greater than or equal to a set threshold as a group of the inter-frame luminance change regions.
11. The apparatus of claim 7, wherein the second obtaining module comprises:
the first acquisition unit is used for acquiring at least two frames of test images acquired at the set frame rate; and
the synthesizing unit is used for synthesizing the image to be processed according to the acquired at least two frames of the test images;
or,
the second acquisition module includes:
the second acquisition unit is used for acquiring the at least two frames of test images acquired at the set frame rate and a first exposure time; and
and the third acquisition unit is used for acquiring the image to be processed acquired at the set frame rate and a second exposure time, wherein the second exposure time is longer than the first exposure time.
12. The apparatus of claim 7, wherein the set frame rate is greater than or equal to twice the flicker frequency.
13. An electronic device, characterized in that the device comprises:
a strobe acquisition component;
a camera assembly;
a memory storing executable instructions; and
a processor configured to execute the executable instructions in the memory to implement the method of any of claims 1-6.
14. A readable storage medium having stored thereon executable instructions, wherein the executable instructions when executed by a processor implement the method of any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010366166.0A CN113660428B (en) | 2020-04-30 | 2020-04-30 | Image acquisition method and device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113660428A true CN113660428A (en) | 2021-11-16 |
CN113660428B CN113660428B (en) | 2023-07-04 |
Family
ID=78488637
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010366166.0A Active CN113660428B (en) | 2020-04-30 | 2020-04-30 | Image acquisition method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113660428B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114419076A (en) * | 2022-03-28 | 2022-04-29 | 杭州微影软件有限公司 | Data processing method and device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0556343A (en) * | 1991-08-22 | 1993-03-05 | Olympus Optical Co Ltd | Electronic image pickup device |
CN108573215A (en) * | 2018-03-16 | 2018-09-25 | 海信集团有限公司 | Reflective road method for detecting area, device and terminal |
CN109151255A (en) * | 2018-08-31 | 2019-01-04 | 惠州华阳通用电子有限公司 | A kind of camera flashing removing method and device based on Photoelectric Detection |
CN110263805A (en) * | 2019-05-07 | 2019-09-20 | 阿里巴巴集团控股有限公司 | Certificate verification, auth method, device and equipment |
CN110855889A (en) * | 2019-11-21 | 2020-02-28 | 重庆金山医疗技术研究院有限公司 | Image processing method, image processing apparatus, image processing device, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||