CN117793544B - Image processing method and related device
- Publication number: CN117793544B (application number CN202410217098.XA)
- Authority: CN (China)
- Legal status: Active
Abstract
The application provides an image processing method and a related device. The method includes: first, acquiring multiple frames of low dynamic range images with different exposure degrees; then, determining a reference image from the multiple frames of low dynamic range images; next, determining optical flow information between images adjacent in shooting time among the multiple frames of low dynamic range images; based on that optical flow information, estimating the positions that the target object in the non-reference images of the multiple frames of low dynamic range images would occupy at the shooting time of the reference image; moving the target object in each non-reference image to its position at the shooting time of the reference image to obtain multiple frames of estimated images; determining a first key image and a second key image from the multiple frames of estimated images; and fusing the first key image, the second key image and the reference image into a high dynamic range image. In this way, the quality of the synthesized high dynamic range image can be improved.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and a related device.
Background
With the continued development of electronic devices, more and more users prefer to capture images with them. When capturing high dynamic range images, electronic devices typically synthesize a high dynamic range (HDR) image from multiple frames of low dynamic range (LDR) images. However, when there is a moving target object in the photographed scene, a high dynamic range image synthesized from multiple frames of low dynamic range images is often of poor quality.
Disclosure of Invention
The application provides an image processing method and a related device, which can realize the synthesis of a high dynamic range image through multiple frames of low dynamic range images, thereby improving the effect of the synthesized high dynamic range image.
In a first aspect, the present application provides an image processing method applied to an electronic device, where the electronic device includes a camera, and the method includes: acquiring, through the camera and in shooting-time order, multiple frames of low dynamic range images with different exposure degrees; determining a reference image from the multiple frames of low dynamic range images; determining optical flow information between images adjacent in shooting time among the multiple frames of low dynamic range images; estimating, based on the optical flow information between images adjacent in shooting time among the multiple frames of low dynamic range images, the positions of the target object in multiple frames of non-reference images among the multiple frames of low dynamic range images at the shooting time of the reference image, where a non-reference image is different from the reference image; moving, based on those positions, the target object in each non-reference image to its position at the shooting time of the reference image to obtain multiple frames of estimated images; determining a first key image and a second key image from the multiple frames of estimated images, where the position of the target object in the non-reference image corresponding to the first key image does not overlap the position of the target object in the non-reference image corresponding to the second key image; and fusing the first key image, the second key image and the reference image into a high dynamic range image, where the brightness range of the high dynamic range image is wider than the brightness range of the low dynamic range images.
The application provides an image processing method that determines a reference image, a first key image and a second key image from multiple frames of low dynamic range images and fuses them into a high dynamic range image. By selecting a suitable reference image, first key image and second key image, a better fused high dynamic range image can be obtained.
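For orientation only, the sequence of steps above can be sketched in Python as follows; the callables passed in (determine_reference, flow_between, estimate_at_reference_time, select_key_images, fuse) are hypothetical placeholders for the operations described above, not an implementation disclosed by the application.

```python
def synthesize_hdr(ldr_frames, determine_reference, flow_between,
                   estimate_at_reference_time, select_key_images, fuse):
    """Structural sketch of the method of the first aspect.

    ldr_frames: low dynamic range images in shooting-time order with
    differing exposure degrees. Each callable argument stands in for one
    step described above and is a hypothetical placeholder.
    """
    reference = determine_reference(ldr_frames)
    flows = [flow_between(a, b) for a, b in zip(ldr_frames, ldr_frames[1:])]
    non_refs = [frame for frame in ldr_frames if frame is not reference]
    estimated = [estimate_at_reference_time(frame, flows, reference)
                 for frame in non_refs]
    key1, key2 = select_key_images(estimated)   # hole regions must not overlap
    return fuse(key1, key2, reference)          # brightness range wider than any input
```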
In one possible implementation, the plurality of frames of low dynamic range images include a plurality of frames of high exposure images and a plurality of frames of low exposure images, the high exposure images having a greater exposure than the low exposure images.
In one possible implementation, the method further includes: determining exposure values of the multiple frames of low dynamic range images; determining, from the multiple frames of low dynamic range images, the lowest exposure image with the lowest exposure value; and performing brightness alignment of the multiple frames of high exposure images to the lowest exposure image. Determining the reference image from the multiple frames of low dynamic range images specifically includes: determining the reference image from the brightness-aligned multiple frames of high exposure images. In this way, aligning the brightness of the multiple frames of high exposure images to the lowest exposure image improves the quality of the high exposure images and makes it easier to calculate optical flow information between adjacent frames, and the reference image determined from the brightness-aligned high exposure images has more detail information and fewer noise points.
In one possible implementation, the method further includes: and integrally aligning the multi-frame low exposure image and the multi-frame high exposure image with the reference image after the brightness alignment. Thus, the multi-frame low exposure image and the multi-frame high exposure image after brightness alignment are integrally aligned to the reference image, so that the calculation amount and time of a subsequent algorithm, such as an image fusion algorithm, can be reduced.
In one possible implementation manner, determining optical flow information between images adjacent to shooting time in the multi-frame low dynamic range image specifically includes: and determining the multi-frame low exposure image after the whole alignment and optical flow information between the images adjacent to the shooting time in the multi-frame high exposure image after the brightness alignment and the whole alignment. In this way, through the optical flow information between the multiple frames of low exposure images after the integral alignment and the multiple frames of high exposure images after the brightness alignment and the integral alignment, the position of the target object in the multiple frames of non-reference images can be conveniently and subsequently estimated when the reference image corresponds to the shooting time.
In one possible implementation manner, based on optical flow information between images adjacent to shooting time in the multiple frames of low dynamic range images, the estimating the position of the target object in multiple frames of non-reference images in the multiple frames of low dynamic range images when the reference image corresponds to the shooting time specifically includes: based on the multi-frame low exposure image after the integral alignment and the optical flow information between the images adjacent to the shooting time in the multi-frame high exposure image after the brightness alignment and the integral alignment, the position of the target object in the multi-frame non-reference image when the reference image corresponds to the shooting time is estimated; wherein the multi-frame non-reference image includes the multi-frame low exposure image after the global alignment and the multi-frame high exposure image after the brightness alignment and the global alignment.
In one possible implementation, estimating, based on the optical flow information between images adjacent in shooting time among the overall-aligned multiple frames of low exposure images and the brightness-aligned and overall-aligned multiple frames of high exposure images, the position of the target object in the multiple frames of non-reference images at the shooting time of the reference image specifically includes: estimating, through a Kalman filtering algorithm and based on that optical flow information, the position of the target object in the multiple frames of non-reference images at the shooting time of the reference image. In this way, the position of the target object in the multiple frames of non-reference images at the shooting time of the reference image, as estimated by the Kalman filtering algorithm, can be more accurate.
In one possible implementation manner, the brightness alignment of the multi-frame high exposure image to the lowest exposure image specifically includes: the exposure value of the multi-frame high exposure image is divided by the exposure value of the lowest exposure image.
In one possible implementation, the multi-frame low dynamic range image of different exposure levels includes multi-frame low dynamic range images of K exposure levels, K being a positive integer greater than 2.
In one possible implementation, the method further includes: determining a brightness distribution histogram of the multiple frames of low dynamic range images, where the brightness distribution histogram indicates, for a low dynamic range image, the distribution of pixel counts over brightness levels. Determining the reference image from the multiple frames of low dynamic range images specifically includes: determining, based on the brightness distribution histograms of the multiple frames of low dynamic range images, the image with the most uniform brightness distribution as the reference image. In this way, the reference image with the most uniform brightness distribution determined from the multiple frames of low dynamic range images has better detail information.
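As an illustration only, one way such a histogram-based selection could be sketched is shown below (Python/NumPy); the use of histogram entropy as the uniformity measure is an assumption, since the application does not prescribe a specific metric.

```python
import numpy as np

def luminance_histogram(gray_img, bins=256):
    """Pixel count per brightness level for one 8-bit grayscale LDR frame."""
    hist, _ = np.histogram(gray_img, bins=bins, range=(0, 256))
    return hist

def most_uniform_frame(gray_frames):
    """Return the frame whose brightness distribution is most uniform,
    scored here by histogram entropy (higher = more uniform)."""
    def entropy(hist):
        p = hist / max(hist.sum(), 1)
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())
    return max(gray_frames, key=lambda f: entropy(luminance_histogram(f)))
```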
In one possible implementation, the method further includes: determining an exposure value of the multi-frame low dynamic range image; determining the lowest exposure image with the lowest exposure value from the multi-frame low dynamic range image; and carrying out brightness alignment on a plurality of frames of non-lowest exposure images in the plurality of frames of low dynamic range images to the lowest exposure image, wherein the plurality of frames of non-lowest exposure images comprise the reference image, and the plurality of frames of non-lowest exposure images are different from the lowest exposure image. Therefore, the multi-frame non-minimum exposure image is subjected to brightness alignment to the minimum exposure image, so that the quality of the multi-frame non-minimum exposure image can be improved, and the optical flow information between adjacent frames of the images can be calculated conveniently.
In one possible implementation, the method further includes: the multi-frame non-lowest exposure image after the lowest exposure image is aligned with brightness is aligned with the reference image as a whole. Thus, the plurality of frames of non-minimum exposure images after the minimum exposure image and the brightness are aligned integrally with the reference image, so that the calculation amount and time of the subsequent algorithm, such as an image fusion algorithm, can be reduced.
In one possible implementation manner, the determining optical flow information between images adjacent to shooting time in the multi-frame low dynamic range image specifically includes: and determining the lowest exposure image after the overall alignment and optical flow information between the images adjacent to the shooting time in the multi-frame non-lowest exposure image after the brightness alignment and the overall alignment. In this way, through the determined lowest exposure image after the whole alignment and the optical flow information between the images with adjacent shooting times in the multiple frames of non-lowest exposure images after the brightness alignment and the whole alignment, the position of the target object in the multiple frames of non-reference images can be conveniently and subsequently estimated when the reference image corresponds to the shooting time.
In one possible implementation manner, the estimating, based on optical flow information between images adjacent to each other in shooting time in the multiple frames of low dynamic range images, a position of a target object in multiple frames of non-reference images in the multiple frames of low dynamic range images when the reference image corresponds to the shooting time specifically includes: based on the lowest exposure image after the integral alignment and the optical flow information between the images adjacent to the shooting time in the multi-frame non-lowest exposure image after the brightness alignment and the integral alignment, the position of the target object in the multi-frame non-reference image when the reference image corresponds to the shooting time is estimated; wherein the multi-frame non-reference image includes the lowest exposure image after the global alignment and the multi-frame non-lowest exposure image after the brightness alignment and the global alignment.
In one possible implementation manner, the estimating the position of the target object in the multiple frames of non-reference images when the reference image corresponds to the shooting time based on the lowest exposure image after the overall alignment and the optical flow information between the images adjacent to the shooting time in the multiple frames of non-lowest exposure images after the brightness alignment and the overall alignment specifically includes: based on the lowest exposure image after the integral alignment and the optical flow information between the images adjacent to the shooting time in the multiple frames of non-lowest exposure images after the brightness alignment and the integral alignment, the position of the target object in the multiple frames of non-reference images when the reference image corresponds to the shooting time is estimated through a Kalman filtering algorithm. In this way, the position of the target object in the multi-frame non-reference image, which is estimated by the kalman filter algorithm, can be more accurate when the reference image corresponds to the shooting time.
In one possible implementation manner, performing brightness alignment on a plurality of frames of non-lowest exposure images in the plurality of frames of low dynamic range images to the lowest exposure image specifically includes: the exposure value of the multi-frame non-lowest exposure image in the multi-frame low dynamic range image is divided by the exposure value of the lowest exposure image.
In a second aspect, the present application provides an electronic device comprising a camera, one or more processors, and one or more memories; wherein the camera, the one or more memories are coupled to the one or more processors, the one or more memories for storing computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of the possible implementations of the first aspect.
In a third aspect, the application provides a chip for use in an electronic device, the chip comprising processing circuitry and interface circuitry, the interface circuitry being configured to receive instructions and transmit them to the processing circuitry, and the processing circuitry being configured to execute the instructions to perform the method in any of the possible implementations of the first aspect.
In a fourth aspect, the application provides a computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of the possible implementations of the first aspect.
Drawings
FIG. 1 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application;
FIGS. 2A-2C are schematic diagrams of a low dynamic range image and a high dynamic range image according to an embodiment of the present application;
FIG. 3 is a flowchart of an image processing method according to an embodiment of the present application;
FIGS. 4A-4C are schematic diagrams of a reference image, a non-reference image, and an estimated image provided by an embodiment of the present application;
FIGS. 5A-5B are schematic diagrams of a first key image and a second key image provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of image fusion according to an embodiment of the present application;
FIGS. 7A-7C are schematic diagrams illustrating the overall alignment of a set of images provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a luminance distribution histogram according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be clearly and thoroughly described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate the three cases where only A exists, both A and B exist, and only B exists. In addition, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the application, unless otherwise indicated, the meaning of "a plurality" is two or more.
The following describes a hardware structure of an electronic device provided in an embodiment of the present application.
Fig. 1 shows a schematic hardware structure of an electronic device 100 according to an embodiment of the present application.
It should be understood that the electronic device 100 shown in fig. 1 is only one example, and that the electronic device 100 may have more or fewer components than shown in fig. 1, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, among others. Among them, the sensor module 180 may include a gyro sensor 180B, an acceleration sensor 180E, a touch sensor 180K, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 may communicate with a network and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD). The display panel may also employ organic light-emitting diode (OLED), active-matrix organic light-emitting diode (AMOLED), flexible light-emitting diode (FLED), Mini-LED, Micro-LED, Micro-OLED, or quantum dot light-emitting diode (QLED) technology, or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like. The ISP is used to process data fed back by the camera 193. The camera 193 is used to capture still images or video. The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "microphone" or "microphone", is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through reverse motion, thereby realizing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory game scenarios.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor may also be used to recognize the attitude of the electronic device, and is applied in applications such as landscape/portrait switching and pedometers.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The keys 190 include a power key, a volume key, etc. The motor 191 may generate a vibration cue. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc. The SIM card interface 195 is used to connect a SIM card.
In the embodiment of the present application, the device type of the electronic device 100 may be any one of a mobile phone, a tablet computer, a handheld computer, a desktop computer, a laptop computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), a smart home device such as a smart large screen or a smart sound box, a wearable device such as a smart bracelet, a smart watch or smart glasses, an extended reality (XR) device such as an augmented reality (AR), virtual reality (VR) or mixed reality (MR) device, a vehicle-mounted device, a smart city device, and the like.
Some concepts of the image processing process involved in the embodiments of the present application are described below.
(1) Low dynamic range image: the luminance range of a low dynamic range image is relatively narrow; for example, the luminance range of a low exposure image among the low dynamic range images is 0 lux to 25,000 lux, and the luminance range of a high exposure image among the low dynamic range images is 25,000 lux to 40,000 lux. Thus, a low dynamic range image cannot accurately restore the details of the higher-brightness regions and the lower-brightness regions in the real scene.
(2) High dynamic range image: the width of the luminance range of the high dynamic range image is relatively wide, for example, the luminance range of the high dynamic range image is 0 lux to 40000 lux. In this way, the high dynamic range image can represent more details in the real scene, including details of the higher brightness region and details of the lower brightness region.
(3) Fusion: the high dynamic range image may be synthesized from multiple frames of low dynamic range images. When the multi-frame low dynamic range image is synthesized into the high dynamic range image, the multi-frame low dynamic range image with different exposure degrees needs to be shot for the same scene. In capturing multiple frames of low dynamic range images using the electronic device 100, the positions of the same object in the multiple frames of low dynamic range images may be different due to the presence of jitter or movement of the object by the electronic device 100. Thus, content that is not in the photographed scene may appear in the synthesized high dynamic range image, for example, only one target object in the photographed scene, but all or part of the plurality of target objects may appear in the synthesized high dynamic range image. A phenomenon in which content that is not present in a shooting scene appears in a high dynamic range image may be referred to as a "ghost" phenomenon.
Illustratively, in one frame of low dynamic range image shown in fig. 2A, the target object 201 is moving rapidly. After the shooting time of the frame shown in fig. 2A, another frame of low dynamic range image, shown in fig. 2B, is captured, in which the target object 201 is still moving rapidly. Fusing the low dynamic range image shown in fig. 2A with the low dynamic range image shown in fig. 2B into the high dynamic range image shown in fig. 2C may cause a ghost 202 to appear in the high dynamic range image shown in fig. 2C.
Therefore, the embodiment of the scheme provides an image processing method. The electronic device 100 may acquire multi-frame low dynamic range images of different exposure levels; the electronic device 100 may determine a reference image from the multi-frame low dynamic range image; the electronic device 100 may determine optical flow information between images adjacent in shooting time in the multi-frame low dynamic range image; the electronic device 100 may infer a position of the target object in a plurality of non-reference images in the plurality of low dynamic range images when the reference image corresponds to the photographing time based on optical flow information between images adjacent to the photographing time in the plurality of low dynamic range images; the electronic device 100 may move the target object in the non-reference image to a position when the reference image corresponds to the shooting time, so as to obtain a multi-frame presumption image; the electronic device 100 may determine a first key image and a second key image from the multiframe predictive image; the electronic device 100 may fuse the first key image, the second key image, and the reference image into a high dynamic range image. Thus, the electronic device 100 can make the synthesized high dynamic range image more natural and of higher quality when synthesizing the high dynamic range image from the multi-frame low dynamic range image.
An image processing method provided by the embodiment of the application is described below.
Fig. 3 is a schematic flow chart of an image processing method according to an embodiment of the present application.
S301, acquiring multi-frame low dynamic range images with different exposure degrees according to a shooting time sequence through a camera.
The multi-frame low dynamic range images with different exposure degrees are used for indicating multi-frame low dynamic range images with different brightness obtained by adjusting exposure parameters (such as shutter speed, aperture size, ISO sensitivity, etc.) of the camera in the process of shooting by the camera through the electronic device 100. The low dynamic range image may be as shown with reference to fig. 2A and 2B described above.
S302, determining a reference image from the multi-frame low dynamic range image.
The reference image may be one frame of image with the most uniform brightness distribution in the multi-frame low dynamic range image, or the reference image may be one frame of image with the most detailed information in the multi-frame low dynamic range image. In this way, at the time of the subsequent image processing, the calculation amount of the image processing can be reduced.
S303, determining optical flow information between images adjacent to shooting time in the multi-frame low dynamic range image.
The electronic device 100 may determine optical flow information between images adjacent in shooting time in the multiple frames of low dynamic range images based on an optical flow algorithm. The optical flow information may be used to indicate information of the motion vector of the target object over the time series between images adjacent in shooting time in the multiple frames of low dynamic range images. The information of the motion vector may include the position, motion direction, and motion speed of the target object. For example, the motion vector of the target object in one frame of image may be [x, y, v_x, v_y], where (x, y) indicates the pixel coordinates of the target object, and (v_x, v_y) indicates the movement speed and direction of the target object.
The optical flow algorithm may include: the pyramid optical flow method, the Lucas-Kanade method, and neural-network-based optical flow methods (e.g., FlowNet/FlowNet 2.0).
In this way, the electronic device 100 may determine optical flow information between any two frames of images in the multiple frames of low dynamic range images, including the position, the moving direction, the moving speed, and the like of the target object, through optical flow information between images with adjacent shooting times in the multiple frames of low dynamic range images, so as to facilitate subsequent image processing.
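For illustration, dense optical flow between adjacent frames could be computed as in the following sketch (Python/OpenCV); Farneback flow is used here as one concrete stand-in for the pyramid, Lucas-Kanade or FlowNet-style methods named above, and the function and parameter values are assumptions, not part of the application.

```python
import cv2

def adjacent_frame_flows(gray_frames):
    """Dense optical flow between frames adjacent in shooting time.

    gray_frames: list of 8-bit grayscale frames in shooting-time order.
    Returns one flow field per adjacent pair; flow[y, x] = (dx, dy) is the
    per-pixel motion vector from the earlier to the later frame.
    """
    flows = []
    for prev, nxt in zip(gray_frames, gray_frames[1:]):
        # Farneback parameters: pyr_scale, levels, winsize, iterations,
        # poly_n, poly_sigma, flags (values chosen for illustration only).
        flow = cv2.calcOpticalFlowFarneback(prev, nxt, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        flows.append(flow)
    return flows
```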
S304, estimating, based on the optical flow information between images adjacent in shooting time in the multiple frames of low dynamic range images, the positions of the target object in the multiple frames of non-reference images among the multiple frames of low dynamic range images at the shooting time of the reference image, where a non-reference image is different from the reference image.
Illustratively, the reference image shown in fig. 4A includes the target object 201 and the shooting background 203, where the reference image shown in fig. 4A has unclear detailed information. The non-reference image shown in fig. 4B includes a target object 201 and a photographing background 203. The electronic apparatus 100 may estimate the position of the target object 201 in the non-reference image shown in fig. 4B at the time when the reference image shown in fig. 4A corresponds to the photographing time (for example, the target object 204 in the non-reference image shown in fig. 4B) based on the optical flow information between the non-reference image shown in fig. 4B and the reference image shown in fig. 4A.
S305, moving, based on the positions of the target object in the multiple frames of non-reference images among the multiple frames of low dynamic range images at the shooting time of the reference image, the target object in each non-reference image to its position at the shooting time of the reference image, to obtain multiple frames of estimated images.
Illustratively, the estimated image shown in fig. 4C is the non-reference image of fig. 4B with the target object 201 moved to its position at the shooting time of the reference image shown in fig. 4A (e.g., the target object 205 in fig. 4C); the details at the original position of the target object 201 (e.g., the hole region 206 in fig. 4C) are missing in the estimated image.
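Purely as an illustration of how such a hole region arises, the following minimal sketch (Python/NumPy) moves a masked object by an estimated displacement and records the vacated area; the mask, displacement and helper name are hypothetical, and the application does not specify this pixel-level operation.

```python
import numpy as np

def move_object(image, object_mask, dx, dy, fill_value=0):
    """Move the pixels covered by object_mask by (dx, dy) pixels.

    The vacated area is left filled with fill_value, corresponding to a
    hole region such as 206 in fig. 4C. object_mask is a boolean array of
    the same height/width as image; (dx, dy) would come from the position
    estimated for the reference image's shooting time.
    """
    dx, dy = int(round(dx)), int(round(dy))
    estimated = image.copy()
    estimated[object_mask] = fill_value              # vacate the original position
    ys, xs = np.nonzero(object_mask)
    new_ys = np.clip(ys + dy, 0, image.shape[0] - 1)
    new_xs = np.clip(xs + dx, 0, image.shape[1] - 1)
    estimated[new_ys, new_xs] = image[ys, xs]        # paste the object at its new position
    hole_mask = object_mask.copy()
    hole_mask[new_ys, new_xs] = False                # overlapping pixels are no longer a hole
    return estimated, hole_mask
```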
S306, determining a first key image and a second key image from the multiple frames of estimated images, where the position of the target object in the non-reference image corresponding to the first key image does not overlap the position of the target object in the non-reference image corresponding to the second key image.
Illustratively, the first key image shown in fig. 5A includes the shooting background 203, the target object 205, and a hole region 206, in which the information is missing. The hole region 206 in the first key image shown in fig. 5A corresponds to the position of the target object in the non-reference image corresponding to the first key image. The second key image shown in fig. 5B likewise includes the shooting background 203, the target object 205, and a hole region 206 in which the information is missing; this hole region 206 corresponds to the position of the target object in the non-reference image corresponding to the second key image. The hole region 206 of the first key image shown in fig. 5A does not overlap the hole region 206 of the second key image shown in fig. 5B.
S307, fusing the first key image, the second key image and the reference image into a high dynamic range image, wherein the brightness range of the high dynamic range image is wider than the brightness range of the low dynamic range image.
For example, as shown in fig. 6, the reference image may include the target object 201 and the shooting background 203, where part of the information in the reference image is not clear (e.g., the region indicated by 207 in the reference image). The first key image includes the shooting background 203, the target object 205, and a hole region 206 in which the detail information is missing. The second key image likewise includes the shooting background 203, the target object 205, and a hole region 206 in which the detail information is missing. The electronic device 100 may fill in and repair the unclear detail information in the reference image by using the clear detail information in the first key image and/or the second key image, to obtain a high dynamic range image.
In one possible implementation, the electronic device 100 may blend the first key image or the second key image with the reference image into a high dynamic range image.
In this way, the electronic device 100 fuses the first key image, the second key image, and the reference image into a high dynamic range image that is more natural and of higher quality.
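The non-overlap condition of S306 and the filling of the unclear region can be illustrated with the minimal sketch below (Python/NumPy); the binary masks for hole and unclear regions, and the per-pixel selection rule, are assumptions made only for illustration.

```python
import numpy as np

def holes_overlap(hole_mask_a, hole_mask_b):
    """S306 condition: two candidate key images are chosen only if their
    hole regions do not overlap."""
    return bool(np.logical_and(hole_mask_a, hole_mask_b).any())

def fuse_with_reference(reference, key1, hole1, key2, hole2, unclear_mask):
    """Fill the unclear region of the reference image with detail from the
    key images, taking pixels from whichever key image is not a hole there.

    A minimal per-pixel selection for illustration; a practical fusion
    would also weight and blend the differently exposed frames.
    """
    fused = reference.copy()
    take_from_key1 = unclear_mask & ~hole1
    take_from_key2 = unclear_mask & hole1 & ~hole2
    fused[take_from_key1] = key1[take_from_key1]
    fused[take_from_key2] = key2[take_from_key2]
    return fused
```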
In some embodiments, the electronic device 100 may acquire a plurality of frames of low dynamic range images through the camera according to a photographing time sequence, where the plurality of frames of low dynamic range images include a plurality of frames of high exposure images and a plurality of frames of low exposure images, and the exposure degree of the high exposure images is greater than the exposure degree of the low exposure images.
Wherein underexposure of the low-exposure image (e.g., the camera of the electronic device 100 receives a smaller amount of light) results in the low-exposure image being too dark and the details being not clear enough; overexposure of the high exposure image (e.g., an excessive amount of light received by the camera of electronic device 100) results in the high exposure image being over-bright and the bright portion being distorted.
In one possible implementation, the electronic device 100 may acquire the multiple frames of low dynamic range images by alternately acquiring low exposure images and high exposure images. For example, the electronic device 100 may acquire one frame of low exposure image, one frame of high exposure image, one frame of low exposure image, one frame of high exposure image, and so on, thereby obtaining multiple frames of high exposure images and multiple frames of low exposure images. The low exposure images may be denoted by identifiers S1, S2, S3, …: the first acquired frame of low exposure image is denoted S1, the second frame S2, the third frame S3, and so on. The high exposure images may be denoted by identifiers H1, H2, H3, …: the first acquired frame of high exposure image is denoted H1, the second frame H2, the third frame H3, and so on. In this way, by alternately acquiring low exposure images and high exposure images to obtain multiple frames of low dynamic range images with different exposure degrees, a sufficient time interval can be left between adjacent low exposure images or adjacent high exposure images.
In one possible implementation, the electronic device 100 may acquire multiple frames of high exposure images before acquiring multiple frames of low exposure images.
In one possible implementation, the electronic device 100 may acquire multiple frames of low exposure images before acquiring multiple frames of high exposure images.
The electronic device 100 may determine the exposure value of each of the multiple frames of low dynamic range images, and determine, from the multiple frames of low dynamic range images, the lowest exposure image with the lowest exposure value. The exposure value is used to indicate the total amount of light received by the electronic device 100 when capturing a low dynamic range image. The electronic device 100 may calculate the exposure value of each frame of low dynamic range image based on the shutter speed, the aperture, and the sensitivity (ISO). For example, when a low dynamic range image is captured, the shutter speed is 1/100 second, the aperture is f/2.8 (where f represents the focal length and 2.8 represents the relative aperture value), and the ISO is 200. The electronic device 100 may convert the values of shutter speed, aperture, and ISO into exposure values (EV). For example, a shutter speed of 1/100 second corresponds to an EV value of 0, an aperture of f/2.8 corresponds to an EV value of 0, and an ISO sensitivity of 200 corresponds to an EV value of 2. The electronic device 100 may add the EV values of the shutter speed, the aperture, and the ISO sensitivity to obtain an exposure value of 2 for the low dynamic range image. In this way, when the brightness of the high exposure images is subsequently adjusted based on the lowest exposure image, it can be ensured as far as possible that the brightness of the adjusted high exposure images is not too dark.
The electronic device 100 may divide the exposure value of the multi-frame high exposure image by the exposure value of the lowest exposure image such that the electronic device 100 performs brightness alignment of the multi-frame high exposure image to the lowest exposure image. For example, the electronic device 100 may divide the exposure value of one frame of the high exposure image (for example, the exposure value is 8) by the exposure value of the lowest exposure image (for example, the exposure value is 2), to obtain the exposure value of the high exposure image with aligned brightness (for example, the exposure value is 4). In this way, by adjusting the exposure value of the high exposure image, the high exposure image and the low exposure image can be as close as possible in brightness, so that the subsequent image fusion and synthesis can be better performed.
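The exposure-value bookkeeping described above can be sketched as follows (Python/NumPy); the per-parameter EV contributions mirror the simplified example in the preceding paragraphs, and the pixel-level scaling in align_brightness is an assumption, since the application only states that the exposure values are divided.

```python
import numpy as np

def frame_exposure_value(shutter_ev, aperture_ev, iso_ev):
    """Exposure value of one frame, following the simplified example above:
    shutter 1/100 s -> 0, aperture f/2.8 -> 0, ISO 200 -> 2 gives EV = 2."""
    return shutter_ev + aperture_ev + iso_ev

def brightness_alignment_factor(high_ev, lowest_ev):
    """Per the description, the high exposure image's exposure value is
    divided by that of the lowest exposure image, e.g. 8 / 2 = 4."""
    return high_ev / lowest_ev

def align_brightness(high_img, factor):
    """Hypothetical pixel-level use of the factor: scale the high exposure
    image's intensities down so they are comparable to the lowest exposure
    image (the application does not spell out this pixel operation)."""
    return np.clip(high_img.astype(np.float32) / factor, 0, 255).astype(np.uint8)
```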
In one possible implementation, the electronic device 100 may use any one of the multiple frames of high exposure images with aligned brightness as the reference image.
In one possible implementation, the electronic device 100 may use one frame with the greatest detail information in the multi-frame high exposure image with aligned brightness as the reference image. Thus, the fusion of the subsequent images can be facilitated.
The electronic device 100 may determine a consistency transformation matrix between each of the multiple frames of low exposure images and the reference image, and integrally align the multiple frames of low exposure images with the reference image based on those consistency transformation matrices. The consistency transformation matrix is used to indicate the geometric transformation relation between a frame of low exposure image and the reference image. In this way, by aligning each of the multiple frames of low exposure images with the reference image, each low exposure image becomes spatially consistent with the reference image, and subsequent image processing can be performed better.
The electronic device 100 may likewise determine a consistency transformation matrix between each of the brightness-aligned multiple frames of high exposure images and the reference image, and integrally align the brightness-aligned multiple frames of high exposure images with the reference image based on those matrices. In this way, each brightness-aligned high exposure image becomes spatially consistent with the reference image, and subsequent image processing can be performed better.
Illustratively, in the reference image shown in fig. 7A, the photographing background 203 is at the middle position of the image, and the target object 201 is in motion. In the image shown in fig. 7B, which is to be aligned integrally with the reference image, the photographing background 203 is on top of the image, and the target object 201 is in motion. The electronic device 100 may determine a consistency transformation matrix between the image shown in fig. 7B that is ready for global alignment with the reference image and the reference image shown in fig. 7A. The electronic device 100 may move the photographing background 203 in the image to be aligned integrally with the reference image shown in fig. 7B to an intermediate position of the image based on the consistency transformation matrix, to be aligned integrally with the reference image shown in fig. 7A, resulting in an aligned integrally image as shown in fig. 7C. In the integrally aligned image shown in fig. 7C, the photographing background 203 is at the middle position of the image.
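As one possible realization of the consistency transformation matrix, a homography estimated from matched feature points could be used, as in the sketch below (Python/OpenCV); the application does not name a specific estimation method, so the ORB features, the RANSAC threshold and the function name are assumptions.

```python
import cv2
import numpy as np

def globally_align(image, reference):
    """Integrally align `image` to `reference` with a single geometric transform.

    Both inputs are assumed to be 8-bit grayscale. A homography estimated
    from matched ORB features serves as the consistency transformation
    matrix here; other feature or matrix-estimation choices are possible.
    """
    orb = cv2.ORB_create(1000)
    kpts1, desc1 = orb.detectAndCompute(image, None)
    kpts2, desc2 = orb.detectAndCompute(reference, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(desc1, desc2), key=lambda m: m.distance)[:200]
    src = np.float32([kpts1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kpts2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # needs >= 4 matches
    h, w = reference.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```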
The electronic device 100 may determine, based on an optical flow algorithm, the optical flow information between images adjacent in shooting time among the overall-aligned multiple frames of low exposure images and the brightness-aligned and overall-aligned multiple frames of high exposure images. The optical flow information includes the forward optical flow and the backward optical flow between such adjacent images, and may be used to indicate information of the motion vector of the target object over the time series between those adjacent images.
In this way, based on the optical flow information between temporally adjacent images, the electronic device 100 can obtain, for any two frames among the integrally aligned low-exposure images and the brightness-aligned and integrally aligned high-exposure images, optical flow information such as the position, moving direction and moving speed of the target object, which facilitates the subsequent image processing.
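The patent does not name a specific optical flow algorithm. As a hedged sketch, dense Farneback flow from OpenCV can stand in for it, computing forward and backward flow between every pair of frames adjacent in shooting time; the function name and parameter values are assumptions.

```python
import cv2

def pairwise_flows(frames_in_time_order):
    """Forward and backward dense optical flow between frames adjacent in
    shooting time; each flow field is HxWx2 with per-pixel motion vectors."""
    gray = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames_in_time_order]
    flows = []
    for prev, nxt in zip(gray, gray[1:]):
        fwd = cv2.calcOpticalFlowFarneback(prev, nxt, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        bwd = cv2.calcOpticalFlowFarneback(nxt, prev, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        flows.append((fwd, bwd))  # (forward flow, backward flow) per adjacent pair
    return flows
```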
Based on this optical flow information between temporally adjacent images, the electronic device 100 may infer the position of the target object in each frame of a multi-frame non-reference image at the shooting time corresponding to the reference image; the multi-frame non-reference image consists of the integrally aligned multi-frame low-exposure images and the brightness-aligned and integrally aligned multi-frame high-exposure images.
In one possible implementation, the position of the target object in the multi-frame non-reference image at the shooting time corresponding to the reference image is estimated with a Kalman filtering algorithm from the optical flow information between the temporally adjacent images. The Kalman filtering algorithm makes this position estimate more accurate.
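As a hedged sketch of that Kalman filtering step, a constant-velocity filter over the target object's centroid can predict the centroid at the reference image's shooting time. The centroid track and timestamps are assumed inputs derived from the optical flow, and the noise settings are illustrative assumptions.

```python
import numpy as np

def predict_position_at(t_ref, observed_times, observed_positions):
    """Constant-velocity Kalman filter over (x, y) centroid observations;
    returns the predicted centroid at time t_ref."""
    x = np.array([observed_positions[0][0], observed_positions[0][1], 0.0, 0.0])  # [x, y, vx, vy]
    P = np.eye(4)
    Q = np.eye(4) * 1e-2                      # process noise (assumed)
    R = np.eye(2) * 1.0                       # measurement noise (assumed)
    H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
    t_prev = observed_times[0]
    for t, z in zip(observed_times[1:], observed_positions[1:]):
        dt = t - t_prev
        F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
        x, P = F @ x, F @ P @ F.T + Q                     # predict
        y = np.asarray(z, dtype=float) - H @ x            # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                    # Kalman gain
        x, P = x + K @ y, (np.eye(4) - K @ H) @ P         # update
        t_prev = t
    dt = t_ref - t_prev
    F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
    return (F @ x)[:2]                                    # predicted (x, y) at t_ref
```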
Based on the inferred position of the target object in each frame of the multi-frame non-reference image at the shooting time corresponding to the reference image, the electronic device 100 may move the target object in each non-reference image to that position, obtaining multiple frames of inferred images.
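A minimal sketch of this move might translate the object's pixels by the predicted displacement; the object mask and the displacement are assumed inputs, and filling the hole left behind the object (for example from another frame) is deliberately omitted here.

```python
import cv2
import numpy as np

def move_object(image, object_mask, displacement):
    """Translate the target object by displacement = (dx, dy) pixels to the
    position inferred for the reference image's shooting time."""
    dx, dy = displacement
    M = np.float32([[1, 0, dx], [0, 1, dy]])
    h, w = image.shape[:2]
    shifted_obj = cv2.warpAffine(image, M, (w, h))
    shifted_mask = cv2.warpAffine(object_mask, M, (w, h))
    out = image.copy()
    out[shifted_mask > 0] = shifted_obj[shifted_mask > 0]  # paste object at new position
    return out
```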
The electronic device 100 may determine a first key image and a second key image from the multiple frames of inferred images, where the position of the target object in the non-reference image corresponding to the first key image does not overlap its position in the non-reference image corresponding to the second key image.
The electronic device 100 may fuse the first key image, the second key image, and the reference image into a high dynamic range image.
In one possible implementation, the electronic device 100 may blend the first key image or the second key image with the reference image into a high dynamic range image.
In this way, the electronic device 100 fuses the first key image, the second key image, and the reference image into a high dynamic range image that is more natural and of higher quality.
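The patent leaves the fusion operator open. As one hedged possibility, Mertens exposure fusion from OpenCV can merge the first key image, the second key image and the reference image into a single well-exposed result; the function name is illustrative.

```python
import cv2
import numpy as np

def fuse_frames(first_key, second_key, reference):
    """Merge the two key images and the reference with Mertens exposure
    fusion; one possible realization of the final fusion step."""
    merge = cv2.createMergeMertens()
    fused = merge.process([first_key, second_key, reference])  # float32, roughly in [0, 1]
    return np.clip(fused * 255, 0, 255).astype(np.uint8)
```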
In one possible implementation, after the electronic device 100 integrally aligns the multi-frame low-exposure images and the brightness-aligned multi-frame high-exposure images with the reference image, it may directly select a key image A and a key image B from the integrally aligned multi-frame low-exposure images. Then, based on the optical flow information between temporally adjacent images among the integrally aligned low-exposure images and the brightness-aligned and integrally aligned high-exposure images, the electronic device 100 may infer the positions of the target object in key image A and key image B at the shooting time corresponding to the reference image. Moving the target object in key image A to that position yields the first key image, and moving the target object in key image B to that position yields the second key image. The electronic device 100 may then fuse the first key image, the second key image, and the reference image into a high dynamic range image.
In some embodiments, the electronic device 100 may acquire, through the camera and in shooting-time order, multiple frames of low dynamic range images with different exposure degrees, the frames covering K different exposure degrees, where K is a positive integer greater than 2.
The electronic device 100 may determine a luminance distribution histogram of the multi-frame low dynamic range image. The luminance distribution histogram indicates how many pixels of a low dynamic range image fall at each luminance level. Illustratively, in the luminance distribution histogram shown in fig. 8, the horizontal axis represents the luminance levels 0 to 255 of the image and the vertical axis represents the number of pixels at each luminance level; for example, 100,000 pixels of the image have a luminance level of 70. A reference image selected with the aid of such a histogram therefore has a better exposure.
Based on the luminance distribution histograms of the multi-frame low dynamic range images, the electronic device 100 may determine, from the multi-frame low dynamic range images, the frame with the most uniform luminance distribution as the reference image.
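A small sketch of such a selection: compute a 256-bin luminance histogram per frame and keep the frame whose histogram is most uniform. Shannon entropy is used here as the uniformity score, which is an assumption, since the patent only requires "the most uniform luminance distribution".

```python
import cv2
import numpy as np

def pick_reference(frames):
    """Choose the frame whose 256-bin luminance histogram is most uniform,
    scored by Shannon entropy (uniformity metric is an assumption)."""
    best_idx, best_entropy = 0, -1.0
    for i, frame in enumerate(frames):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
        p = hist / hist.sum()
        entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
        if entropy > best_entropy:
            best_idx, best_entropy = i, entropy
    return frames[best_idx]
```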
The electronic device 100 may determine the exposure value of the multiple frames of low dynamic range images, and determine the lowest exposure image with the lowest exposure value from the multiple frames of low dynamic range images.
The electronic device 100 may divide the exposure values of the multi-frame non-lowest-exposure images among the multi-frame low dynamic range images by the exposure value of the lowest exposure image, thereby performing brightness alignment on the multi-frame low dynamic range images. The non-lowest-exposure images are the frames other than the lowest exposure image and include the reference image.
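Assuming an approximately linear sensor response, the division by the exposure-value ratio could be realized as a per-pixel gain, as in the following sketch; the exposure values themselves (for example, exposure time multiplied by gain) are assumed inputs.

```python
import numpy as np

def brightness_align(image, image_exposure, lowest_exposure):
    """Scale a non-lowest-exposure frame down by its exposure ratio so its
    brightness matches the lowest-exposure frame (linear-response assumption)."""
    ratio = image_exposure / lowest_exposure
    aligned = image.astype(np.float32) / ratio
    return np.clip(aligned, 0, 255).astype(np.uint8)
```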
The electronic device 100 may determine a consistency transformation matrix between each of the lowest exposure image and the brightness-aligned multi-frame non-lowest-exposure images on the one hand and the reference image on the other, and integrally align the lowest exposure image and the brightness-aligned multi-frame non-lowest-exposure images with the reference image based on those matrices.
The electronic device 100 may determine the optical flow information between images that are adjacent in shooting time among the integrally aligned lowest exposure image and the brightness-aligned and integrally aligned multi-frame non-lowest-exposure images.
Based on this optical flow information, the electronic device 100 may infer the position of the target object in each frame of a multi-frame non-reference image at the shooting time corresponding to the reference image, obtaining multiple frames of inferred images; here the multi-frame non-reference image consists of the integrally aligned lowest exposure image and the brightness-aligned and integrally aligned multi-frame non-lowest-exposure images.
In one possible implementation, the electronic device 100 may perform this inference with a Kalman filtering algorithm, estimating from the optical flow information the position of the target object in each non-reference image at the shooting time corresponding to the reference image to obtain the multiple frames of inferred images.
The electronic device 100 may determine a first key image and a second key image from the multiple frames of inferred images, where the position of the target object in the non-reference image corresponding to the first key image does not overlap its position in the non-reference image corresponding to the second key image.
The electronic device 100 may fuse the first key image, the second key image, and the reference image into a high dynamic range image.
In this way, the electronic device 100 fuses the first key image, the second key image, and the reference image into a high dynamic range image that is more natural and of higher quality.
In one possible implementation, after the electronic device 100 integrally aligns the lowest exposure image and the brightness-aligned multi-frame non-lowest-exposure images with the reference image, it may directly select a key image A and a key image B from the brightness-aligned and integrally aligned multi-frame non-lowest-exposure images. Then, based on the optical flow information between temporally adjacent images among the integrally aligned lowest exposure image and the brightness-aligned and integrally aligned non-lowest-exposure images, the electronic device 100 may infer the positions of the target object in key image A and key image B at the shooting time corresponding to the reference image. Moving the target object in key image A to that position yields the first key image, and moving the target object in key image B to that position yields the second key image. The electronic device 100 may then fuse the first key image, the second key image, and the reference image into a high dynamic range image.
The embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor, implements the steps of the above-described method embodiments.
Embodiments of the present application also provide a computer program product which, when run on an electronic device, enables the electronic device to carry out the steps of the method embodiments described above.
The embodiments of the present application also provide a chip system comprising a processor coupled with a memory; the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
The term "user interface (UI)" in the description and drawings of the present application refers to a medium interface for interaction and information exchange between an application program or an operating system and a user, which converts between an internal form of information and a form acceptable to the user. The user interface of an application program is source code written in a specific computer language such as Java or the extensible markup language (XML); the interface source code is parsed and rendered on the terminal device and finally presented as content the user can recognize, such as pictures, text, buttons and other controls. Controls, also known as widgets, are the basic elements of a user interface; typical controls include the toolbar, menu bar, text box, button, scrollbar, picture and text. The properties and content of the controls in an interface are defined by tags or nodes; for example, XML specifies the controls contained in an interface by nodes such as <Textview>, <ImgView> and <VideoView>. A node corresponds to a control or an attribute in the interface, and after parsing and rendering the node is presented as content visible to the user. In addition, the interfaces of many applications, such as hybrid applications, usually contain web pages. A web page, also referred to as a page, can be understood as a special control embedded in an application interface; it is source code written in a specific computer language, such as the hypertext markup language (HTML), cascading style sheets (CSS) or JavaScript (JS), and the web page source code can be loaded and displayed as user-recognizable content by a browser or by a web page display component with browser-like functionality. The specific content contained in a web page is likewise defined by tags or nodes in the web page source code; for example, HTML defines the elements and attributes of a web page by <p>, <img>, <video> and <canvas>.
A commonly used presentation form of a user interface is a graphical user interface (graphic user interface, GUI), which refers to a graphically displayed user interface that is related to computer operations. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The above embodiments may be implemented, in whole or in part, by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.
Those of ordinary skill in the art will appreciate that all or part of the above-described method embodiments may be implemented by a computer program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the procedures of the above-described method embodiments. The aforementioned storage medium includes a ROM, a random access memory (RAM), a magnetic disk, an optical disk, or the like.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
Claims (10)
1. An image processing method, applied to an electronic device, the electronic device including a camera, the method comprising:
Acquiring multiple frames of low-dynamic-range images with different exposure degrees according to a shooting time sequence through the camera, wherein the multiple frames of low-dynamic-range images comprise multiple frames of high-exposure images and multiple frames of low-exposure images, the exposure degree of the high-exposure images is larger than that of the low-exposure images, and the high-exposure images and the low-exposure images are alternately acquired;
determining a reference image from the multi-frame low dynamic range image;
determining optical flow information between images adjacent to shooting time in the multi-frame low dynamic range image;
estimating, based on the optical flow information between the images adjacent to the shooting time in the multi-frame low dynamic range image, the position of a moving target object in a multi-frame non-reference image in the multi-frame low dynamic range image when the reference image corresponds to the shooting time, wherein the non-reference image is different from the reference image;
based on the position of the target object in a plurality of frames of non-reference images in the plurality of frames of low dynamic range images when the reference image corresponds to the shooting time, moving the target object in the non-reference images to the position when the reference image corresponds to the shooting time, and obtaining a plurality of frames of estimated images;
determining a first key image and a second key image from the multi-frame presumption image, wherein the position of the target object in the non-reference image corresponding to the first key image is not overlapped with the position of the target object in the non-reference image corresponding to the second key image;
And fusing the first key image, the second key image and the reference image into a high dynamic range image, wherein the brightness range of the high dynamic range image is wider than that of the low dynamic range image.
2. The method according to claim 1, wherein the method further comprises:
Determining exposure values of the multi-frame low dynamic range image;
determining the lowest exposure image with the lowest exposure value from the multi-frame low dynamic range image;
performing brightness alignment of the multi-frame high exposure image to the lowest exposure image;
The method for determining the reference image from the multi-frame low dynamic range image specifically comprises the following steps:
and determining the reference image from the multi-frame high exposure image with aligned brightness.
3. The method of claim 2, wherein after said determining said reference image from said multi-frame high exposure image with aligned brightness, said method further comprises:
And integrally aligning the multi-frame low-exposure image and the multi-frame high-exposure image with the brightness aligned to the reference image.
4. The method according to claim 3, wherein determining optical flow information between images adjacent to a capturing time in the multi-frame low dynamic range image specifically includes:
And determining optical flow information between the multi-frame low-exposure images after the integral alignment and images adjacent to shooting time in the multi-frame high-exposure images after the brightness alignment and the integral alignment.
5. The method according to claim 4, wherein the estimating the position of the target object in the multi-frame non-reference image in the multi-frame low dynamic range image when the reference image corresponds to the photographing time based on the optical flow information between the images adjacent to the photographing time in the multi-frame low dynamic range image specifically includes:
based on the multi-frame low exposure images after integral alignment and optical flow information between the images adjacent to the shooting time in the multi-frame high exposure images after brightness alignment and integral alignment, the position of the target object in the multi-frame non-reference image when the reference image corresponds to the shooting time is estimated; wherein the multi-frame non-reference image includes the multi-frame low-exposure image after the overall alignment and the multi-frame high-exposure image after the brightness alignment and the overall alignment.
6. The method according to claim 5, wherein the estimating the position of the target object in the multi-frame non-reference image when the reference image corresponds to the photographing time based on the optical flow information between the multi-frame low-exposure image after the overall alignment and the images adjacent to the photographing time in the multi-frame high-exposure image after the brightness alignment and the overall alignment specifically includes:
and estimating, through a Kalman filtering algorithm, the position of the target object in the multi-frame non-reference image when the reference image corresponds to the shooting time, based on the optical flow information between the images adjacent to the shooting time in the multi-frame low exposure images after the integral alignment and the multi-frame high exposure images after the brightness alignment and the integral alignment.
7. The method according to claim 2, wherein said brightness alignment of said multi-frame high exposure image to said lowest exposure image, in particular comprises:
dividing the exposure value of the multi-frame high exposure image by the exposure value of the lowest exposure image.
8. An electronic device comprising a camera, one or more processors, and one or more memories; wherein the camera, the one or more memories are coupled to the one or more processors, the one or more memories for storing computer instructions that, when executed by the one or more processors, cause the electronic device to perform the image processing method of any of claims 1-7.
9. A chip for application to an electronic device, the chip comprising processing circuitry and interface circuitry, the interface circuitry for receiving instructions and transmitting to the processing circuitry, the processing circuitry for executing the instructions to perform the image processing method of any of claims 1-7.
10. A computer readable storage medium comprising instructions which, when run on a processor of an electronic device, cause the electronic device to perform the image processing method of any of claims 1-7.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410217098.XA CN117793544B (en) | 2024-02-28 | 2024-02-28 | Image processing method and related device |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN117793544A (en) | 2024-03-29 |
| CN117793544B (en) | 2024-10-15 |
Family ID: 90383743
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202410217098.XA Active CN117793544B (en) | Image processing method and related device | 2024-02-28 | 2024-02-28 |

Country Status (1)

| Country | Link |
|---|---|
| CN (1) | CN117793544B (en) |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |