
CN110278386B - Image processing method, device, storage medium and electronic device - Google Patents


Info

Publication number
CN110278386B
CN110278386B
Authority
CN
China
Prior art keywords
image
yuv
exposure time
synthesized
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910579969.1A
Other languages
Chinese (zh)
Other versions
CN110278386A (en)
Inventor
康健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910579969.1A
Publication of CN110278386A
Application granted
Publication of CN110278386B
Legal status: Active


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract


The present application discloses an image processing method, device, storage medium, and electronic device. The method includes: acquiring a first exposure time; acquiring multiple frames of YUV images to be synthesized according to the first exposure time; synthesizing the multiple frames of YUV images to be synthesized to obtain a high dynamic range image; and using the high dynamic range image for an image preview, photographing, or video recording operation. The image obtained by the image processing solution provided in this application is suitable for image preview, photographing, and video recording.


Description

Image processing method, image processing device, storage medium and electronic equipment
Technical Field
The present application belongs to the field of image technologies, and in particular, to an image processing method, an image processing apparatus, a storage medium, and an electronic device.
Background
A High Dynamic Range (HDR) image can provide more dynamic range and image detail than an ordinary image. An electronic device can shoot multiple frames of the same scene at different exposure levels and combine the dark-area details of the overexposed frame, the mid-tone details of the normally exposed frame, and the bright-area details of the underexposed frame to obtain an HDR image. However, images processed by related HDR techniques are difficult to use simultaneously for preview, photographing, and video recording.
Disclosure of Invention
Embodiments of the present application provide an image processing method, an image processing apparatus, a storage medium, and an electronic device, in which the processed image is suitable for preview, photographing, and video recording.
An embodiment of the present application provides an image processing method, including:
acquiring a first exposure time;
acquiring a plurality of frames of YUV images to be synthesized according to the first exposure time;
synthesizing the multiple frames of YUV images to be synthesized to obtain a high dynamic range image;
and performing an image preview, photographing, or video recording operation using the high dynamic range image.
An embodiment of the present application provides an image processing apparatus, including:
the first acquisition module is used for acquiring first exposure time;
the second acquisition module is used for acquiring multiple frames of YUV images to be synthesized according to the first exposure time;
the synthesis module is used for carrying out synthesis processing on the multiple frames of YUV images to be synthesized to obtain a high dynamic range image;
and the processing module is used for performing an image preview, photographing, or video recording operation using the high dynamic range image.
The embodiment of the application provides a storage medium storing a computer program which, when executed on a computer, causes the computer to execute the flow in the image processing method provided by the embodiments of the present application.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the flow in the image processing method provided by the embodiment of the present application by calling the computer program stored in the memory.
In the embodiment of the application, a suitable exposure time is obtained first; multiple frames of YUV images to be synthesized are then acquired according to that exposure time; the multiple frames are synthesized to obtain a high dynamic range image; and finally the high dynamic range image is used for image preview, photographing, or video recording. Because the YUV images are acquired with a suitable exposure time and have already undergone noise reduction and other processing, the quality of the finally synthesized high dynamic range image is good, and it can be used directly for image preview, photographing, and video recording. That is, the image obtained by the image processing scheme provided by this embodiment is suitable for preview, photographing, and video recording.
Drawings
The technical solutions and advantages of the present application will become apparent from the following detailed description of specific embodiments of the present application when taken in conjunction with the accompanying drawings.
Fig. 1 is a schematic flowchart of a first image processing method according to an embodiment of the present application.
Fig. 2 is a schematic flowchart of a second image processing method according to an embodiment of the present application.
Fig. 3 is a schematic flowchart of a third image processing method according to an embodiment of the present application.
Fig. 4 is a fourth flowchart illustrating an image processing method according to an embodiment of the present application.
Fig. 5 is a scene schematic diagram of an image processing method according to an embodiment of the present application.
Fig. 6 is a fifth flowchart illustrating an image processing method according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of a first electronic device according to an embodiment of the present application.
Fig. 9 is a schematic structural diagram of a second electronic device according to an embodiment of the present application.
Fig. 10 is a schematic structural diagram of an image processing circuit according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
It is understood that the execution subject of the embodiment of the present application may be an electronic device such as a smart phone or a tablet computer.
Referring to fig. 1, fig. 1 is a first schematic flow chart of an image processing method according to an embodiment of the present application, where the flow chart may include:
101. a first exposure time is acquired.
102. And acquiring a plurality of frames of YUV images to be synthesized according to the first exposure time.
The image processing method provided by the embodiment can be applied to electronic equipment with a camera module. The camera module of the electronic device may include an image processing circuit, which may include a camera and an image signal processor, where the camera includes at least one lens and an image sensor. The lens collects external light and delivers it to the image sensor; the image sensor senses the light from the lens and converts it into a digitized original image, namely a RAW image, which is supplied to the image signal processor for processing. The image signal processor can perform format conversion, noise reduction, and other processing on the RAW image to obtain a YUV image. RAW is an unprocessed and uncompressed format, which may be described intuitively as a "digital negative". YUV is a color encoding method in which Y represents luminance (luma) and U and V represent the two chrominance (color-difference) components; the natural appearance of a scene can be perceived directly by the human eye from a YUV image.
For example, the electronic device may acquire a first exposure time. The first exposure time may be determined by the electronic device according to the current shooting scene. For example, the electronic device analyzes and learns from multiple shooting scenes in advance to determine which exposure time corresponds to which shooting scene. For instance, if the current scene is shooting scene A and the electronic device determines that the exposure time corresponding to scene A is t1, then the first exposure time is t1.
It should be noted that, in this embodiment, the first exposure time may be such that an image obtained by synthesizing a plurality of frames of images acquired according to the first exposure time has a high dynamic range.
After a shooting application program (such as a system application "camera" of the electronic device) is started according to a user operation, a scene aimed at by a camera of the electronic device is a shooting scene. For example, after the user clicks an icon of a "camera" application on the electronic device with a finger to start the "camera application", if the user uses a camera of the electronic device to align a scene including an XX object, the scene including the XX object is a shooting scene. From the above description, it will be understood by those skilled in the art that the shooting scene is not specific to a particular scene, but is a scene aligned in real time following the orientation of the camera.
In this embodiment, after the electronic device obtains the first exposure time, it may acquire multiple frames of YUV images to be synthesized according to the first exposure time. For example, if the electronic device determines the first exposure time to be t1, it may acquire multiple frames of YUV images to be synthesized with an exposure time of t1. It should be noted that, in this embodiment, the exposure times of the multiple frames of YUV images to be synthesized are the same.
The embodiment is not particularly limited to how many frames of YUV images to be synthesized are obtained, and a person skilled in the art can select a suitable number of YUV images to be synthesized according to actual needs, for example, the number of YUV images to be synthesized can be determined according to a maximum frame rate supported by the electronic device.
In this embodiment, the electronic device may first acquire a plurality of frames of RAW images by using an image sensor, then perform format conversion, noise reduction, and other processing on each frame of RAW image by using an image signal processor, and convert the RAW image into a YUV color space, so as to obtain a plurality of frames of YUV images to be synthesized, which are suitable for viewing by human eyes.
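As a rough illustration of the format-conversion step described above, the sketch below converts a single RGB pixel (as might come from a demosaiced RAW frame) to YUV. The BT.601 coefficient set is an assumption for illustration; the patent does not specify which conversion the image signal processor uses, and a real ISP performs this per pixel in hardware.

```python
def rgb_to_yuv(r, g, b):
    """Convert one 8-bit RGB pixel to YUV (full-range BT.601 form).

    A minimal sketch of the RAW-to-YUV format-conversion step; the
    coefficients are the standard BT.601 values, assumed here since
    the text does not name a specific conversion.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b          # luma
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128   # blue-difference chroma
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128    # red-difference chroma
    return y, u, v
```

For example, a pure white pixel (255, 255, 255) maps to maximum luma with neutral chroma, and black (0, 0, 0) to zero luma with neutral chroma.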
103. And synthesizing the multiple frames of YUV images to be synthesized to obtain the high dynamic range image.
For example, after obtaining multiple frames of YUV images to be synthesized, the electronic device may perform synthesis processing on the multiple frames of YUV images to be synthesized, so as to obtain a high dynamic range image. For example, after obtaining 4 frames of YUV images to be synthesized, the electronic device may perform synthesis processing on the 4 frames of YUV images to be synthesized, so as to obtain a high dynamic range image.
104. And previewing or photographing or recording the image by using the high dynamic range image.
For example, after obtaining the high dynamic range image, the electronic device may perform an image preview, photographing, or video recording operation using it. The electronic device may display the high dynamic range image on the preview interface of its camera application for the user to preview. Alternatively, when the electronic device receives a photographing instruction, for example when the user presses the shutter button, it may directly output the high dynamic range image as a photo on the display screen for the user to view. Or, when the electronic device receives a video recording instruction, it may use the high dynamic range image as one of the frames of the recorded video.
In this embodiment, a suitable exposure time is obtained first; multiple frames of YUV images to be synthesized are then acquired according to that exposure time; the multiple frames are synthesized to obtain a high dynamic range image; and finally the high dynamic range image is used for image preview, photographing, or video recording. Because the YUV images are acquired with a suitable exposure time and have already undergone noise reduction and other processing, the quality of the finally synthesized high dynamic range image is good, and it can be used directly for image preview, photographing, and video recording. That is, the image obtained by the image processing scheme provided by this embodiment is suitable for preview, photographing, and video recording.
Referring to fig. 2, fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present application, where the flow chart may include:
201. and the electronic equipment acquires the reference YUV image according to the second exposure time.
In the present embodiment, the second exposure time is not limited. For example, an exposure time may be preset by the user, and determined as the second exposure time; alternatively, one exposure time may be randomly determined by the electronic device, determined as a second exposure time, and so on.
For example, when a camera of the electronic device is directed at a shooting scene, the electronic device automatically determines an exposure time according to the shooting scene, and the exposure time may be a second exposure time.
After obtaining a second exposure time according to the above manner, the electronic device may obtain the reference YUV image according to the second exposure time.
In some embodiments, the RAW image may be obtained by the image sensor, and then the image signal processor performs format conversion, noise reduction, and the like on the RAW image, so as to obtain the reference YUV image.
202. The electronic device determines a first exposure time based on the reference YUV image.
After the electronic device obtains the reference YUV image, it may determine the first exposure time according to the reference YUV image. For example, the electronic device can determine the overexposed region of the reference YUV image and then determine the first exposure time according to the size of that overexposed region.
For example, during the day, when a scene outside a window is photographed from indoors, a large overexposed area may exist in the captured picture. The sky may be rendered as a patch of white rather than blue; that washed-out region is an overexposed area. The more sky in the frame, the larger the overexposed area of the captured picture.
In some embodiments, when there are some regions in the image acquired by the electronic device that cannot embody the detailed portion, the regions may be determined as overexposed regions. For example, on a sunny day, the sky seen by the human eye typically includes a blue sky and a white cloud. If the blue sky and the white cloud cannot be seen in the part of the sky in the image acquired by the electronic device, and only a piece of white can be seen, the area of the sky is the overexposure area.
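One crude way to quantify such a region, offered purely as an illustration (the text describes learned detection, not a fixed rule), is to count the fraction of pixels whose luma sits at or above a clipping level. The clip threshold of 250 is a hypothetical parameter.

```python
def overexposed_fraction(luma, clip=250):
    """Fraction of pixels at or above an assumed 8-bit clipping level.

    A stand-in sketch for the overexposed-region detection described
    above; the clip level is a hypothetical parameter, not a value
    taken from the patent.
    """
    if not luma:
        raise ValueError("empty image")
    return sum(1 for p in luma if p >= clip) / len(luma)
```

A frame with two clipped and two well-exposed pixels would score 0.5; the score rises with the size of the washed-out region.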
The shorter the exposure time, the more information is captured in the overexposed region. Therefore, when the overexposed area is large, the electronic device may select a shorter exposure time as the first exposure time; when the overexposed area is small, the electronic device may select a longer exposure time as the first exposure time. However, since this embodiment obtains the HDR image by compositing multiple short-exposure frames, too short an exposure time may make the finally composited HDR image too dark. Therefore, the first exposure time should not be set too small. That is, in this embodiment, the first exposure time is greater than a preset exposure time. The preset exposure time can be set according to the actual situation, but it should not be set too low, so that the brightness of the finally synthesized image remains appropriate.
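The floor on the first exposure time described above amounts to a simple clamp, sketched below; the 1 ms floor is a hypothetical preset, not a value given in the text.

```python
def first_exposure_time(candidate_ms, preset_min_ms=1.0):
    """Clamp a candidate exposure time so it never falls below the
    preset exposure time, keeping the merged HDR image from coming
    out too dark. preset_min_ms is an assumed placeholder value.
    """
    return max(candidate_ms, preset_min_ms)
```
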
203. The electronic device obtains a maximum frame rate supported by the electronic device.
204. The electronic device determines the number of targets according to a maximum frame rate supported by the electronic device.
205. The electronic equipment obtains a target number of YUV images to be synthesized according to the first exposure time.
It can be understood that the more images are used to compose an HDR image, the better the quality of the final HDR image will be. Therefore, in the related art, more than 3 frames are generally used to synthesize an HDR image. Due to hardware limitations, however, the maximum frame rate supported by the electronic device is 60 fps; that is, the electronic device acquires at most 60 images per second. If more than 3 frames are used to compose each HDR image, previewing or recording may exhibit a noticeable stutter.
For example, assume the maximum frame rate supported by the electronic device is 60 fps, i.e., 60 images can be acquired per second. Then, if 4 frames are used to compose each HDR image, only 15 HDR frames are available per second. Fewer than 24 frames per second can make playback feel jerky, so presenting 15 HDR frames per second would give the user a noticeable stuttering sensation.
To avoid this stuttering, the present embodiment determines the number of YUV images to be synthesized for each HDR image according to the maximum frame rate supported by the electronic device.
For example, if the maximum frame rate supported by the electronic device is 60fps, the electronic device obtains 2 frames of YUV images to be synthesized according to the first exposure time. If the maximum frame rate supported by the electronic device is 90fps, the electronic device acquires 3 frames of YUV images to be synthesized according to the first exposure time. This ensures that 30 frames of images are acquired and displayed per second so that the user does not experience a stuck condition.
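The frame-count rule in the examples above (2 frames at 60 fps, 3 at 90 fps) amounts to dividing the sensor's maximum frame rate by a minimum acceptable output rate. The 30 fps output floor below is inferred from those examples rather than stated explicitly in the text.

```python
def target_frame_count(max_frame_rate_fps, min_output_fps=30):
    """Number of short-exposure frames to merge per HDR output while
    keeping the output stream at or above min_output_fps.

    min_output_fps=30 is inferred from the 60fps -> 2 frames and
    90fps -> 3 frames examples, not a value the method fixes.
    """
    return max(1, max_frame_rate_fps // min_output_fps)
```
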
It can be understood that, when the maximum frame rate supported by the electronic device is 90fps, the electronic device may also obtain 2 frames of YUV images to be synthesized for synthesis processing according to the first exposure time, so as to reduce the processing load of the processor. In order to obtain an HDR image with better quality, the electronic device may acquire as many YUV images to be synthesized as possible to synthesize the HDR image, while ensuring that the images are not jammed.
In the present embodiment, although the frame rate of the electronic device limits how many images can be used to synthesize each HDR image, the multi-frame short-exposure synthesis is performed on YUV images. A YUV image has already undergone noise reduction and other processing, so the quality of a single YUV frame is better than that of a RAW frame, and the finally synthesized HDR image is correspondingly better. Moreover, because the HDR image is synthesized from multiple short exposures, ghosting in the HDR image is not obvious. However, the exposure times of the multiple short exposures are relatively short, so the luminance of the finally synthesized HDR image may be dark. In that case, after the HDR image is synthesized, its luminance can be further boosted to make its brightness moderate.
206. And the electronic equipment synthesizes the target number of YUV images to be synthesized to obtain the high dynamic range image.
207. The electronic equipment utilizes the high dynamic range image to perform image preview or photographing or video recording operation.
For example, after the electronic device obtains 2 frames of YUV images to be synthesized, the electronic device may perform synthesis processing on the 2 frames of YUV images to obtain a high dynamic range image, i.e., an HDR image. After obtaining the HDR image, the electronic device may display the HDR image on a preview interface of the camera application for a user to preview, take a photograph, or record.
In some embodiments, to further improve the quality of the HDR image, the electronic device may detect whether the luminance of the HDR image is less than a preset luminance threshold; if the brightness of the HDR image is smaller than the preset brightness threshold, the electronic device may output the HDR image after performing brightness enhancement processing on the HDR image, so that the user may preview, photograph, or record the image.
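The brightness check just described can be sketched as follows, on a normalized luma representation. Both the threshold and the simple multiplicative gain are assumptions for illustration; the text does not specify how the brightness enhancement is performed.

```python
def enhance_if_dark(luma, threshold=0.4, gain=1.5):
    """Boost a normalized-luma image when its mean brightness falls
    below an assumed threshold. The multiplicative gain is only an
    illustration of "brightness enhancement processing"; real
    pipelines may use tone curves instead.
    """
    mean = sum(luma) / len(luma)
    if mean < threshold:
        # Clamp to the valid [0, 1] range after applying the gain.
        return [min(1.0, p * gain) for p in luma]
    return luma
```
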
Referring to fig. 3, fig. 3 is a third flowchart illustrating an image processing method according to an embodiment of the present disclosure, and in some embodiments, the process 202 may include:
2021. the electronic device calculates an HDR score or light ratio of the reference YUV image, where the HDR score describes the size of the overexposed region of the reference YUV image.
2022. The electronic device determines a first exposure time from the HDR fraction or light ratio of the reference YUV image.
In some embodiments, after obtaining the reference YUV image, the electronic device may calculate an HDR score or light ratio of the reference YUV image and then determine the first exposure time from it. The HDR score describes the size of the overexposed area of the reference YUV image: a higher HDR score indicates a larger overexposed area, and a lower HDR score a smaller one. The light ratio represents the ratio of light received by the dark and bright surfaces of the subject in the reference YUV image: the larger the light ratio, the larger the overexposed area; the smaller the light ratio, the smaller the overexposed area.
When the overexposed area is large, a shorter exposure time can be used to capture more information in the overexposed region; when the overexposed area is small, a longer exposure time can be used so that, while still capturing some information in the overexposed region, the brightness of the finally synthesized HDR image remains moderate.
For example, when the HDR score of the reference YUV image is g1, the electronic device may determine that the first exposure time is t1; when the HDR score of the reference YUV image is g2, the electronic device may determine that the first exposure time is t2, where g1 > g2 and t1 < t2.
In some embodiments, in an early stage, the electronic device may analyze and learn a large number of YUV images with an overexposure region, and analyze characteristics of the overexposure region. And in the later period, the electronic equipment can directly determine the overexposure area of the YUV image after the YUV image is acquired. And then, determining a first exposure time according to the size of the overexposure area of the YUV image. Wherein, the larger the overexposure area is, the smaller the first exposure time is; the smaller the overexposed area, the larger the first exposure time.
Referring to fig. 4, fig. 4 is a fourth flowchart illustrating an image processing method according to an embodiment of the present disclosure, and in some embodiments, the process 2022 may include:
20221. the electronic device obtains a mapping of HDR scores or light ratios to exposure times.
20222. And the electronic equipment determines the exposure time corresponding to the HDR fraction or the light ratio of the reference YUV image according to the mapping relation to obtain the first exposure time.
For example, the electronic device may preset the HDR score or the mapping of the light ratio to the exposure time.
For example, the mapping of HDR scores to exposure times may be as shown in Table 1.

TABLE 1 Mapping of HDR scores to exposure times

HDR score     | 50  | 60  | 70  | 80
Exposure time | 4ms | 3ms | 2ms | 1ms

The mapping of light ratios to exposure times may be as shown in Table 2.

TABLE 2 Mapping of light ratios to exposure times

Light ratio   | 1:1 | 1:2   | 1:4 | 1:8
Exposure time | 4ms | 3.5ms | 3ms | 2.5ms

Alternatively, the mapping of HDR score ranges to exposure times may be as shown in Table 3.

TABLE 3 Mapping of HDR score ranges to exposure times

HDR score     | 31~40 | 41~50 | 51~60 | 61~70
Exposure time | 4.5ms | 3.5ms | 2.5ms | 1.5ms
That is, in the present embodiment, the mapping relationship between the HDR score or the light ratio and the exposure time may be that an HDR score or a light ratio corresponds to an exposure time; it may also be a range of HDR scores or a range of light ratios corresponding to an exposure time.
In some embodiments, the mapping relationship between the HDR score or light ratio and the exposure time may also be that a plurality of HDR scores or light ratios correspond to one exposure time.
It should be noted that, as to what manner to set the mapping relationship between the HDR fraction or the light ratio and the exposure time, the embodiment of the present application is not particularly limited, and a person skilled in the art may set an appropriate mapping relationship between the HDR fraction or the light ratio and the exposure time according to actual needs.
For example, after calculating the HDR score or the light ratio of the reference YUV image, the electronic device may obtain a mapping relationship between the HDR score or the light ratio and the exposure time, and then determine, according to the mapping relationship, the exposure time corresponding to the HDR score or the light ratio of the reference YUV image, and determine the exposure time as the first exposure time.
For example, if the electronic device calculates the HDR score of the reference YUV image to be 70 and the electronic device obtains the mapping relationship as shown in table 1, the first exposure time is 2 ms. Alternatively, if the electronic device calculates the light ratio to be 1:1 and the electronic device obtains the mapping relationship as shown in table 2, the first exposure time is 4 ms.
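The lookup just described might be sketched as below, using the example values from Table 1. Both the tabulated scores and the fallback policy for scores below the table are illustrative assumptions, not part of the method itself.

```python
# Example mapping from Table 1 (HDR score -> exposure time in ms).
HDR_SCORE_TO_EXPOSURE_MS = {50: 4.0, 60: 3.0, 70: 2.0, 80: 1.0}

def exposure_for_score(score, table=HDR_SCORE_TO_EXPOSURE_MS):
    """Return the exposure time for the largest tabulated score that
    does not exceed the measured one; fall back to the longest
    exposure for scores below the table (a hypothetical policy).
    """
    eligible = [s for s in table if s <= score]
    if not eligible:
        return max(table.values())
    return table[max(eligible)]
```

With this table, a measured HDR score of 70 yields 2 ms, matching the worked example in the text.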
In some embodiments, to reduce the amount of computation, the electronic device may only compute the HDR score, thereby determining the first exposure time from the HDR score; alternatively, the electronic device may simply calculate the light ratio, thereby determining the first exposure time from the light ratio.
It should be noted that computing the HDR score requires relatively more computation than computing the light ratio, but the HDR image synthesized from YUV images acquired with a first exposure time determined from the HDR score is relatively better. The manner of determining the first exposure time can therefore be chosen according to actual requirements. For example, the electronic device may analyze its own performance: if its performance is not sufficient to support calculating the HDR score, it can calculate the light ratio instead and determine the first exposure time from the light ratio; if its performance can fully support calculating the HDR score, it may calculate the HDR score and determine the first exposure time from it.
In this embodiment, after acquiring the target number of YUV images to be synthesized, the electronic device further synthesizes the target number of YUV images to be synthesized in a multi-frame short exposure manner, so as to obtain a high dynamic range image with good quality, where the high dynamic range image has the characteristics of insignificant ghost and small noise.
When synthesizing a target number of YUV images to be synthesized to obtain a high dynamic range image, the electronic device first performs multi-frame noise reduction synthesis on the target number of YUV images to be synthesized to obtain a noise-reduced synthesized image. It can be understood that, since the YUV image to be synthesized is a short-exposure image, it will retain more of the features of the brighter area in the shooting scene. Similarly, the noise-reduced composite image obtained by performing multi-frame noise reduction and synthesis on the target number of YUV images to be synthesized also more retains the characteristics of a brighter region in the shooting scene. At this time, the electronic device further increases the brightness of the noise-reduced composite image, so that the features of the bright area and the dark area in the shooting scene are presented at the same time, and a high dynamic range image of the shooting scene is obtained.
For example, suppose the electronic device (whose maximum supported frame rate is 90 fps) acquires 3 frames of YUV images to be synthesized, namely YUV image to be synthesized S1, YUV image to be synthesized S2, and YUV image to be synthesized S3.
The electronic device selects one of the YUV images to be synthesized S1, S2, and S3 as the standard image, for example S1. It then aligns S2 and S3 with S1 and calculates the average pixel value at each position across the aligned images (for example, if the pixel values at a certain position in the 3 images are 0.8, 0.9, and 1, the average pixel value at that position is 0.9). A noise-reduced composite image is then obtained from these averages: the pixel values of the standard image S1 may be adjusted to the calculated averages, or a new image may be generated from the calculated averages and used as the noise-reduced composite image. Finally, the electronic device increases the brightness of the noise-reduced composite image to obtain a high dynamic range image of the shooting scene.
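The per-position averaging in this example can be sketched as follows; the function name and the nested-list image representation are illustrative, and the frames are assumed to be already aligned:

```python
def multi_frame_noise_reduction(aligned_frames):
    """Average the pixel values of pre-aligned frames position by position.

    aligned_frames: list of same-sized 2D images (nested lists of pixel values).
    Returns the noise-reduced composite image as a new 2D nested list.
    """
    frame_count = len(aligned_frames)
    height = len(aligned_frames[0])
    width = len(aligned_frames[0][0])
    return [
        [
            sum(frame[i][j] for frame in aligned_frames) / frame_count
            for j in range(width)
        ]
        for i in range(height)
    ]
```

With the worked example from the text, pixel values 0.8, 0.9, and 1 at the same position average to 0.9.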
In some embodiments, the electronic device may perform synthesis processing on the reference YUV image and the multiple frames of YUV images to be synthesized to obtain a high dynamic range image in the case that the frame rate of the electronic device supports.
It should be noted that, when the exposure time of the reference YUV image is longer than that of the YUV images to be synthesized, the high dynamic range image obtained by synthesizing the reference YUV image together with the multiple frames of YUV images to be synthesized has greater luminance than one obtained by synthesizing only the multiple frames of YUV images to be synthesized.
For example, as shown in fig. 5, in the same shooting scene, the electronic device may acquire a reference YUV image N at a random exposure time, i.e., the second exposure time, and then calculate the HDR score of the reference YUV image N. Next, the electronic device may obtain a mapping relationship between the HDR score or the light ratio and the exposure time, and determine the exposure time corresponding to the HDR score of the reference YUV image N according to the mapping relationship, obtaining the first exposure time. The electronic device may then acquire multiple frames of YUV images to be synthesized according to the first exposure time, such as S1, S2, S3, S4, S5, and S6, and synthesize them to obtain high dynamic range images. For example, S1 and S2 are synthesized to obtain the 1st frame of high dynamic range image, S3 and S4 to obtain the 2nd frame, and S5 and S6 to obtain the 3rd frame. The electronic device may display the 3 frames of high dynamic range images on a preview interface of its camera application for the user to preview. Alternatively, when the electronic device receives a photographing instruction, it may display one of the frames as a photo output on the display screen for the user to view. Or, when the electronic device receives a video recording instruction, it may use the 3 frames as the 1st, 2nd, and 3rd frames of the recorded video. The first exposure time may be the same as or different from the second exposure time.
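The mapping relationship between the HDR score and the first exposure time is not specified in the embodiment; one plausible sketch is a bucketed lookup table in which a larger overexposed area maps to a shorter exposure time (all thresholds and times below are hypothetical):

```python
import bisect

# Hypothetical score buckets: upper bounds of the HDR-score intervals,
# and the exposure time (in seconds) assigned to each bucket.
SCORE_THRESHOLDS = [0.05, 0.15, 0.30]
EXPOSURE_TIMES = [1 / 60, 1 / 120, 1 / 250, 1 / 500]

def first_exposure_time(score):
    """Look up the exposure time bucket that a given HDR score falls into."""
    return EXPOSURE_TIMES[bisect.bisect_right(SCORE_THRESHOLDS, score)]
```

A reference image with almost no overexposure keeps a relatively long exposure, while a heavily overexposed one is pushed toward a very short first exposure time.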
In some embodiments, when the electronic device is in the same shooting scene and no moving objects are present in the scene, it may also adopt the following compositing strategy: the reference YUV image N is synthesized with the YUV images to be synthesized S1 and S2 to obtain the 1st frame of high dynamic range image, N is synthesized with S3 and S4 to obtain the 2nd frame, and N is synthesized with S5 and S6 to obtain the 3rd frame, finally yielding 3 frames of high dynamic range images with relatively high luminance. Here, the exposure time of the reference YUV image is longer than that of the YUV images to be synthesized.
In other embodiments, when the electronic device is in a different shooting scene, suppose the frame rate of the electronic device is 90 fps. The electronic device may first acquire 1 frame of a first reference YUV image at a random exposure time (the second exposure time) and determine the first exposure time from it. The electronic device may then acquire 2 frames of first YUV images to be synthesized according to the first exposure time and synthesize the first reference YUV image with them to obtain the 1st frame of high dynamic range image; by analogy, it obtains the 2nd frame of high dynamic range image, the 3rd frame, and so on. When the electronic device detects that the shooting scene has changed, it may acquire 1 frame of a second reference YUV image at a random exposure time or at the first exposure time and determine a third exposure time from it. The electronic device may then acquire 2 frames of second YUV images to be synthesized according to the third exposure time and synthesize the second reference YUV image with them to obtain the 4th frame of high dynamic range image; by analogy, it obtains the 5th frame, the 6th frame, and so on.
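The frame budgets in these examples (3 source frames per high dynamic range frame at 90 fps capture) follow from keeping the synthesized output at or above 24 frames per second, as the claims require. A sketch, assuming the target number is simply the capture frame rate divided by 24:

```python
MIN_HDR_FPS = 24  # synthesized high dynamic range output must stay at or above 24 fps

def target_number(max_frame_rate):
    """Number of YUV frames combined into one high dynamic range frame.

    Combining n source frames divides the capture frame rate by n, so the
    largest usable n is max_frame_rate // 24. The claimed method requires a
    maximum frame rate of at least 48 fps (so that n is at least 2).
    """
    if max_frame_rate < 2 * MIN_HDR_FPS:
        raise ValueError("method requires a capture frame rate of at least 48 fps")
    return max_frame_rate // MIN_HDR_FPS
```

At 90 fps this gives 3 frames per high dynamic range image, matching the examples; whether the embodiment uses exactly this division is an assumption.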
Referring to fig. 6, fig. 6 is a schematic diagram of a fifth flowchart of an image processing method according to an embodiment of the present application, where the flowchart may include:
301. the electronic device obtains a first exposure time.
The first exposure time may be determined by the electronic device according to the current shooting scene. For example, the electronic device analyzes and learns from a plurality of shooting scenes in advance, so as to determine which exposure time each shooting scene corresponds to. For instance, if the current scene is shooting scene A and the electronic device determines that the exposure time corresponding to scene A is t1, then the first exposure time is t1.
In some embodiments, the RAW image may be obtained by the image sensor, and then the image signal processor performs format conversion, noise reduction, and the like on the RAW image, so as to obtain the reference YUV image. The electronic device can then determine a first exposure time based on the reference YUV image.
It should be noted that, in this embodiment, the first exposure time may be such that an image synthesized from a plurality of frames of images acquired according to the first exposure time has a high dynamic range.
302. The electronic device acquires a plurality of frames of RAW images according to the first exposure time.
303. The electronic equipment performs preset processing on each frame of RAW image to obtain a plurality of frames of YUV images to be synthesized.
304. The electronic equipment carries out synthesis processing on a plurality of frames of YUV images to be synthesized to obtain a high dynamic range image.
305. The electronic equipment utilizes the high dynamic range image to perform image preview or photographing or video recording operation.
For example, after obtaining the first exposure time, the electronic device may obtain multiple frames of RAW images according to the first exposure time, and then perform preset processing on each frame of RAW image to obtain multiple frames of YUV images to be synthesized. The preset processing may be noise reduction, format conversion, and the like. After obtaining multiple frames of YUV images to be synthesized, the electronic device may perform synthesis processing on the multiple frames of YUV images to be synthesized, so as to obtain a high dynamic range image.
After obtaining the high dynamic range image, the electronic device may perform an image preview, photographing, or video recording operation using it. For example, the electronic device may display the high dynamic range image on a preview interface of its camera application for the user to preview. Alternatively, when the electronic device receives a photographing instruction, for example when the user presses a photographing button, the electronic device may directly display the high dynamic range image as a photo output on the display screen for the user to view. Or, when the electronic device receives a video recording instruction, the electronic device may use the high dynamic range image as one of the frames of the recorded video.
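Flows 301 to 305 can be strung together as a single sketch; every callable passed in here is a stand-in, since the embodiment does not define these interfaces:

```python
def hdr_pipeline(get_first_exposure_time, capture_raw, preset_process,
                 synthesize, frame_count):
    """Sketch of flows 301-305; all parameters are hypothetical callables."""
    t1 = get_first_exposure_time()                              # flow 301
    raw_frames = [capture_raw(t1) for _ in range(frame_count)]  # flow 302
    yuv_frames = [preset_process(raw) for raw in raw_frames]    # flow 303 (format conversion, noise reduction)
    return synthesize(yuv_frames)                               # flow 304; result feeds preview/photo/video (flow 305)
```

The synthesized result would then be handed to the preview interface, photo output, or video encoder as described above.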
The preset processing is processing that does not affect the synthesis of the high dynamic range image. For example, brightness enhancement would make the resulting YUV image relatively bright and could affect the synthesis of the high dynamic range image, so the preset processing should not include brightness enhancement.
In some embodiments, flow 303 may include:
and the electronic equipment performs format conversion and noise reduction processing on each frame of RAW image to obtain a plurality of frames of YUV images to be synthesized.
For example, the electronic device may perform format conversion and noise reduction on each frame of RAW image to obtain multiple frames of YUV images to be synthesized. This keeps the quality of the YUV images to be synthesized relatively good, and in turn the quality of the high dynamic range image obtained by synthesizing them relatively good.
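The format conversion step from a demosaiced RAW pixel to YUV typically applies a fixed color matrix. A sketch using the common BT.601 full-range coefficients (an assumption; the embodiment does not name a conversion standard):

```python
def rgb_to_yuv(r, g, b):
    """BT.601 full-range RGB -> YUV for one demosaiced pixel.

    This is the digital YCbCr form (U and V offset by 128), often loosely
    called YUV; the choice of BT.601 coefficients is an assumption.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128
    return y, u, v
```

For a neutral gray pixel, U and V both sit at the 128 midpoint, so only the Y (luminance) channel varies, which is what makes the later brightness operations on YUV images straightforward.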
Referring to fig. 7, fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure. The image processing apparatus includes: a first obtaining module 401, a second obtaining module 402, a synthesizing module 403 and a processing module 404.
A first obtaining module 401, configured to obtain a first exposure time;
a second obtaining module 402, configured to obtain multiple frames of YUV images to be synthesized according to the first exposure time;
a synthesizing module 403, configured to perform synthesizing processing on the multiple frames of YUV images to be synthesized to obtain a high dynamic range image;
and the processing module 404 is configured to perform image preview or photographing or video recording operations using the high dynamic range image.
In some embodiments, the first obtaining module 401 may be configured to: acquiring a reference YUV image according to the second exposure time; and determining a first exposure time according to the reference YUV image.
In some embodiments, the first obtaining module 401 may be configured to: calculating an HDR score or an optical ratio of the reference YUV image, wherein the HDR score is used for describing the size of an overexposed area of the reference YUV image; determining a first exposure time according to the HDR fraction or the light ratio of the reference YUV image.
In some embodiments, the first obtaining module 401 may be configured to: acquiring a mapping relation between the HDR fraction or the light ratio and the exposure time; and determining the exposure time corresponding to the HDR fraction or the light ratio of the reference YUV image according to the mapping relation to obtain a first exposure time.
In some embodiments, the synthesis module 403 may be configured to: acquiring a maximum frame rate supported by the electronic equipment; determining the number of targets according to the maximum frame rate supported by the electronic equipment; acquiring a target number of YUV images to be synthesized according to the first exposure time; and synthesizing the target number of YUV images to be synthesized to obtain the high dynamic range image.
In some embodiments, the second obtaining module 402 may be configured to: acquiring a plurality of frames of RAW images according to the first exposure time; and performing preset processing on each frame of RAW image to obtain a plurality of frames of YUV images to be synthesized.
In some embodiments, the second obtaining module 402 may be configured to: and carrying out format conversion and noise reduction on each RAW image to obtain a plurality of frames of YUV images to be synthesized.
The present embodiment provides a computer-readable storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to execute the flow in the image processing method provided by this embodiment.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the flow in the image processing method provided in this embodiment by calling the computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 8, fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
The electronic device 500 may include a camera module 501, a memory 502, a processor 503, and the like. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 8 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The camera module 501 may include a lens and an image sensor. The lens collects an external light signal and provides it to the image sensor; the image sensor senses the light signal from the lens and converts it into a digitized RAW image, which is provided to the image signal processor for processing. The image signal processor can perform format conversion, noise reduction, and other processing on the RAW image to obtain a YUV image. RAW is an unprocessed and uncompressed format, which may be thought of as a "digital negative". YUV is a color encoding method in which Y represents luminance and U and V represent chrominance; the natural features contained in a YUV image can be directly perceived by the human eye.
The memory 502 may be used to store applications and data. Memory 502 stores applications containing executable code. The application programs may constitute various functional modules. The processor 503 executes various functional applications and data processing by running an application program stored in the memory 502.
The processor 503 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 502 and calling the data stored in the memory 502, thereby performing overall monitoring of the electronic device.
In this embodiment, the processor 503 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 502 according to the following instructions, and the processor 503 runs the application programs stored in the memory 502, so as to execute:
acquiring a first exposure time;
acquiring a plurality of frames of YUV images to be synthesized according to the first exposure time;
synthesizing the multiple frames of YUV images to be synthesized to obtain a high dynamic range image;
and previewing or photographing or recording the image by using the high dynamic range image.
Referring to fig. 9, the electronic device 600 may include a camera module 601, a memory 602, a processor 603, a touch display 604, a speaker 605, a microphone 606, and the like.
The camera module 601 may include an image processing circuit, which may be implemented using hardware and/or software components and may include various processing units that define an image signal processing (ISP) pipeline. The image processing circuit may include at least: a camera, an image signal processor (ISP processor), control logic, an image memory, and a display. The camera may include at least one or more lenses and an image sensor. The image sensor may include a color filter array (e.g., a Bayer filter). The image sensor may acquire the light intensity and wavelength information captured by each of its imaging pixels and provide a set of raw image data that can be processed by the image signal processor.
The image signal processor may process the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the image signal processor may perform one or more image processing operations on the raw image data, gathering statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision. The raw image data can be stored in an image memory after being processed by an image signal processor. The image signal processor may also receive image data from an image memory.
The image memory may be part of a memory device, a storage device, or a separate dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
When image data is received from the image memory, the image signal processor may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to an image memory for additional processing before being displayed. The image signal processor may also receive processed data from the image memory and perform image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Further, the output of the image signal processor may also be sent to an image memory, and the display may read image data from the image memory. In one embodiment, the image memory may be configured to implement one or more frame buffers.
The statistical data determined by the image signal processor may be sent to the control logic. For example, the statistical data may include statistical information of the image sensor such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, lens shading correction, and the like.
The control logic may include a processor and/or microcontroller that executes one or more routines (e.g., firmware). One or more routines may determine camera control parameters and ISP control parameters based on the received statistics. For example, the control parameters of the camera may include camera flash control parameters, control parameters of the lens (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), etc.
Referring to fig. 10, fig. 10 is a schematic structural diagram of the image processing circuit in the present embodiment. As shown in fig. 10, for convenience of explanation, only the aspects of the image processing technique related to the embodiments of the present application are shown.
For example, the image processing circuitry may include: camera, image signal processor, control logic ware, image memory, display. The camera may include one or more lenses and an image sensor, among others. In some embodiments, the camera may be either a tele camera or a wide camera.
The first image collected by the camera is transmitted to the image signal processor for processing. After the image signal processor processes the first image, statistical data of the first image (e.g., image brightness, image contrast, image color, etc.) may be sent to the control logic. The control logic can determine control parameters of the camera according to the statistical data, so that the camera can carry out operations such as automatic focusing and automatic exposure according to the control parameters. The first image can be stored in the image memory after being processed by the image signal processor. The image signal processor may also read an image stored in the image memory for processing. In addition, the first image can be sent directly to the display after being processed by the image signal processor; the display may also read an image from the image memory for display.
In addition, though not shown in the figure, the electronic device may further include a CPU and a power supply module. The CPU is connected with the control logic, the image signal processor, the image memory, and the display, and is used for global control. The power supply module supplies power to each module.
The memory 602 stores applications containing executable code. The application programs may constitute various functional modules. The processor 603 executes various functional applications and data processing by running an application program stored in the memory 602.
The processor 603 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 602 and calling data stored in the memory 602, thereby integrally monitoring the electronic device.
The touch display screen 604 may be used to receive user touch control operations for the electronic device. Speaker 605 may play sound signals. The microphone 606 may be used to pick up sound signals.
In this embodiment, the processor 603 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 602 according to the following instructions, and the processor 603 runs the application programs stored in the memory 602, so as to execute:
acquiring a first exposure time;
acquiring a plurality of frames of YUV images to be synthesized according to the first exposure time;
synthesizing the multiple frames of YUV images to be synthesized to obtain a high dynamic range image;
and previewing or photographing or recording the image by using the high dynamic range image.
In one embodiment, the processor 603 may perform the following when acquiring the first exposure time: acquiring a reference YUV image according to the second exposure time; and determining a first exposure time according to the reference YUV image.
In one embodiment, the processor 603 may perform the determining of the first exposure time according to the reference YUV image by: calculating an HDR score or an optical ratio of the reference YUV image, wherein the HDR score is used for describing the size of an overexposed area of the reference YUV image; determining a first exposure time according to the HDR fraction or the light ratio of the reference YUV image.
In one embodiment, when the processor 603 executes the determining of the first exposure time according to the HDR fraction or the light ratio of the reference YUV image, it may execute: acquiring a mapping relation between the HDR fraction or the light ratio and the exposure time; and determining the exposure time corresponding to the HDR fraction or the light ratio of the reference YUV image according to the mapping relation to obtain a first exposure time.
In an embodiment, before the acquiring multiple frames of YUV images to be synthesized according to the first exposure time, the processor 603 may further perform: acquiring a maximum frame rate supported by the electronic equipment; determining the number of targets according to the maximum frame rate supported by the electronic equipment; the processor 603 may perform, when acquiring multiple frames of YUV images to be synthesized according to the first exposure time: acquiring a target number of YUV images to be synthesized according to the first exposure time; the processor 603 may perform the synthesizing process on the multiple frames of YUV images to be synthesized, and when obtaining the high dynamic range image, may perform: and synthesizing the target number of YUV images to be synthesized to obtain the high dynamic range image.
In an embodiment, when the processor 603 executes the acquiring of multiple frames of YUV images to be synthesized according to the first exposure time, the following steps may be executed: acquiring a plurality of frames of RAW images according to the first exposure time; and performing preset processing on each frame of RAW image to obtain a plurality of frames of YUV images to be synthesized.
In an embodiment, when the processor 603 performs the preset processing on each RAW image to obtain multiple frames of YUV images to be synthesized, the method may perform: and carrying out format conversion and noise reduction on each RAW image to obtain a plurality of frames of YUV images to be synthesized.
In the above embodiments, the descriptions of the embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed description of the image processing method, and are not described herein again.
The image processing apparatus provided in the embodiment of the present application and the image processing method in the above embodiment belong to the same concept, and any method provided in the embodiment of the image processing method may be run on the image processing apparatus, and a specific implementation process thereof is described in the embodiment of the image processing method in detail, and is not described herein again.
It should be noted that, as those skilled in the art can understand, all or part of the process of the image processing method described in the embodiments of the present application can be implemented by a computer program controlling the relevant hardware. The computer program can be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor; its execution may include the processes of the embodiments of the image processing method. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
In the image processing apparatus according to the embodiment of the present application, each functional module may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium, such as a read-only memory, a magnetic or optical disk, or the like.
The foregoing detailed description has provided an image processing method, an image processing apparatus, a storage medium, and an electronic device according to embodiments of the present application, and specific examples are applied herein to explain the principles and implementations of the present application, and the descriptions of the foregoing embodiments are only used to help understand the method and the core ideas of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (7)

1. An image processing method, comprising:
acquiring a reference YUV image according to the second exposure time;
calculating an HDR score or an optical ratio of the reference YUV image, wherein the HDR score is used for describing the size of an overexposed area of the reference YUV image;
determining a first exposure time according to the HDR fraction or the light ratio of the reference YUV image;
acquiring a maximum frame rate supported by the electronic equipment;
determining the target number of YUV images to be synthesized for synthesizing one frame of high dynamic range image according to the maximum frame rate supported by the electronic equipment, so that the high dynamic range image obtained per second is not lower than 24 frames, and the maximum frame rate supported by the electronic equipment is not lower than 48 frames per second;
acquiring a target number of YUV images to be synthesized according to the first exposure time, wherein the exposure times of the target number of YUV images to be synthesized are the same;
synthesizing the target number of YUV images to be synthesized to obtain a frame of high dynamic range image;
and previewing or photographing or recording the image by using the high dynamic range image.
2. The method of claim 1, wherein determining the first exposure time according to the HDR fraction or light ratio of the reference YUV image comprises:
acquiring a mapping relation between the HDR fraction or the light ratio and the exposure time;
and determining the exposure time corresponding to the HDR fraction or the light ratio of the reference YUV image according to the mapping relation to obtain a first exposure time.
3. The image processing method according to claim 1, wherein said obtaining multiple frames of YUV images to be synthesized according to the first exposure time comprises:
acquiring a plurality of frames of RAW images according to the first exposure time;
and performing preset processing on each frame of RAW image to obtain a plurality of frames of YUV images to be synthesized.
4. The image processing method according to claim 3, wherein the performing the preset processing on each RAW image to obtain multiple frames of YUV images to be synthesized comprises:
and carrying out format conversion and noise reduction processing on each frame of RAW image to obtain a plurality of frames of YUV images to be synthesized.
5. An image processing apparatus characterized by comprising:
the first acquisition module is used for acquiring a reference YUV image according to the second exposure time; calculating an HDR score or an optical ratio of the reference YUV image, wherein the HDR score is used for describing the size of an overexposed area of the reference YUV image; determining a first exposure time according to the HDR fraction or the light ratio of the reference YUV image;
the second acquisition module is used for acquiring the maximum frame rate supported by the electronic device; determining the target number of YUV images to be synthesized for synthesizing one frame of high dynamic range image according to the maximum frame rate supported by the electronic device, so that not fewer than 24 frames of high dynamic range images are obtained per second, the maximum frame rate supported by the electronic device being not lower than 48 frames per second; and acquiring the target number of YUV images to be synthesized according to the first exposure time, wherein the exposure times of the target number of YUV images to be synthesized are the same;
the synthesis module is used for synthesizing the target number of YUV images to be synthesized to obtain a frame of high dynamic range image;
and the processing module is used for performing image previewing, photographing, or video recording by using the high dynamic range image.
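The frame-count rule stated in claim 5 pins down the target number: with a sensor supporting at least 48 fps, the largest number of input frames per HDR frame that still yields at least 24 HDR frames per second is the integer part of the maximum frame rate divided by 24. A minimal sketch of that rule (the function name is illustrative, not from the patent):

```python
def frames_to_synthesize(max_fps: int) -> int:
    """Target number of same-exposure YUV frames merged into one HDR frame,
    chosen so the HDR output rate stays at or above 24 fps."""
    if max_fps < 48:
        raise ValueError("the electronic device must support at least 48 fps")
    return max_fps // 24  # max_fps / (max_fps // 24) >= 24 always holds

# At 48 fps the device merges 2 frames (24 HDR fps);
# at 120 fps it can merge 5 frames (still 24 HDR fps).
```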
6. A storage medium having stored therein a computer program which, when run on a computer, causes the computer to execute the image processing method according to any one of claims 1 to 4.
7. An electronic device, characterized in that the electronic device comprises a processor and a memory, wherein the memory stores a computer program, and the processor is used for executing the image processing method according to any one of claims 1 to 4 by calling the computer program stored in the memory.
CN201910579969.1A 2019-06-28 2019-06-28 Image processing method, device, storage medium and electronic device Active CN110278386B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910579969.1A CN110278386B (en) 2019-06-28 2019-06-28 Image processing method, device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN110278386A CN110278386A (en) 2019-09-24
CN110278386B true CN110278386B (en) 2021-06-29

Family

ID=67962615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910579969.1A Active CN110278386B (en) 2019-06-28 2019-06-28 Image processing method, device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN110278386B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109963083B (en) * 2019-04-10 2021-09-24 Oppo广东移动通信有限公司 Image processor, image processing method, photographing device, and electronic equipment
CN112818732B (en) * 2020-08-11 2023-12-12 腾讯科技(深圳)有限公司 Image processing method, device, computer equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016134703A (en) * 2015-01-19 2016-07-25 三菱電機株式会社 Image processing device and method, and program and recording medium
JP2017112462A (en) * 2015-12-15 2017-06-22 キヤノン株式会社 Imaging device and control method, program therefor and storage medium
JP2017112461A (en) * 2015-12-15 2017-06-22 キヤノン株式会社 Imaging device and control method, program thereof, and storage medium
CN109506591A (en) * 2018-09-14 2019-03-22 天津大学 A kind of adaptive illumination optimization method being adapted to complex illumination scene

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011234342A (en) * 2010-04-08 2011-11-17 Canon Inc Image processor and control method thereof
JP2018074210A (en) * 2016-10-25 2018-05-10 キヤノン株式会社 Imaging apparatus
JP6824084B2 (en) * 2017-03-22 2021-02-03 キヤノン株式会社 Imaging device and its control method, program, storage medium


Similar Documents

Publication Publication Date Title
CN110445988B (en) Image processing method, device, storage medium and electronic device
CN110266954B (en) Image processing method, device, storage medium and electronic device
KR102376901B1 (en) Imaging control method and imaging device
CN109040609B (en) Exposure control method, exposure control device, electronic equipment and computer-readable storage medium
CN110381263B (en) Image processing method, device, storage medium and electronic device
CN110033418B (en) Image processing method, image processing device, storage medium and electronic equipment
CN110198417A (en) Image processing method, image processing device, storage medium and electronic equipment
CN110213502B (en) Image processing method, image processing device, storage medium and electronic equipment
CN110445989B (en) Image processing method, device, storage medium and electronic device
CN111028190A (en) Image processing method, image processing device, storage medium and electronic equipment
CN110012227B (en) Image processing method, device, storage medium and electronic device
CN110198418B (en) Image processing method, device, storage medium and electronic device
CN110430370B (en) Image processing method, device, storage medium and electronic device
CN110445986B (en) Image processing method, image processing device, storage medium and electronic equipment
US20210168273A1 (en) Control Method and Electronic Device
CN110198419A (en) Image processing method, device, storage medium and electronic device
CN109993722A (en) Image processing method, device, storage medium and electronic device
CN110717871A (en) Image processing method, image processing device, storage medium and electronic equipment
CN110740266B (en) Image frame selection method and device, storage medium and electronic equipment
CN110290325B (en) Image processing method, device, storage medium and electronic device
CN110278375B (en) Image processing method, device, storage medium and electronic device
CN110572585B (en) Image processing method, image processing device, storage medium and electronic equipment
CN110266967B (en) Image processing method, device, storage medium and electronic device
CN110278386B (en) Image processing method, device, storage medium and electronic device
CN110581957B (en) Image processing method, image processing device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant