Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concepts of the exemplary embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the disclosed aspects may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
In the related art, color migration of an image refers to modifying the color style of an input image with the colors of a reference image, so as to obtain a migrated image that has the same shape as the input image and the same color style as the reference image.
Based on the problems existing in the related art, an embodiment of the present disclosure first provides an information acquisition method, which is applied to the system architecture of the exemplary embodiments of the present disclosure. Fig. 1 shows a schematic diagram of a system architecture of an exemplary embodiment of the present disclosure. As shown in Fig. 1, the system architecture 100 may include a terminal 110, a network 120, and a server 130. The terminal 110 may be any of various electronic devices with image acquisition functions, including but not limited to a cell phone, a tablet computer, a personal computer, a smart wearable device, etc. The medium used by the network 120 to provide a communication link between the terminal 110 and the server 130 may include various connection types, such as wired or wireless communication links, or fiber optic cables. It should be understood that the numbers of terminals, networks, and servers in Fig. 1 are merely illustrative; there may be any number of terminals, networks, and servers, as the implementation requires. For example, the server 130 may be a server cluster formed by a plurality of servers.
The information acquisition method provided by the embodiment of the present disclosure may be performed by the terminal 110, for example, acquiring a migration image and a reference image at the terminal 110, and acquiring target migration result information according to the migration image and the reference image.
In addition, the information acquisition method provided in the embodiments of the present disclosure may also be performed by the server 130. For example, after the terminal 110 obtains the migration image and the reference image, it uploads them to the server 130; the server 130 then obtains the target migration result information of the migration image according to the migration image and the reference image, and returns the target migration result information to the terminal 110. The present disclosure is not limited in this respect.
Exemplary embodiments of the present disclosure provide an electronic device for implementing an information acquisition method, which may be the terminal 110 or the server 130 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the information acquisition method via execution of the executable instructions.
The electronic device may be implemented in various forms, and may include mobile devices such as a mobile phone, a tablet computer, a notebook computer, a personal digital assistant (Personal Digital Assistant, PDA), a navigation device, a wearable device, and a drone, as well as fixed devices such as a desktop computer and a smart television.
The configuration of the electronic device will be exemplarily described below using the mobile terminal 200 of Fig. 2 as an example. It will be appreciated by those skilled in the art that, apart from components intended specifically for mobile purposes, the configuration of Fig. 2 can also be applied to stationary devices. In other embodiments, the mobile terminal 200 may include more or fewer components than illustrated, certain components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is shown schematically only and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also employ interfaces different from those of Fig. 2, or a combination of interfaces.
As shown in fig. 2, the mobile terminal 200 may specifically include a processor 210, an internal memory 221, an external memory interface 222, a USB interface 230, a charge management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display 290, an image pickup module 291, an indicator 292, a motor 293, keys 294, a subscriber identity module (Subscriber Identification Module, SIM) card interface 295, and the like. The sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, a barometric pressure sensor 2804, and the like.
The processor 210 may include one or more processing units; for example, the processor 210 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a neural network processor (Neural-Network Processing Unit, NPU), and the like. The different processing units may be separate devices or may be integrated in one or more processors.
The NPU is a neural network (Neural-Network, NN) computing processor. By drawing on the structure of biological neural networks, for example the mode of transmission between human brain neurons, it rapidly processes input information and can also learn continuously. Applications involving intelligent cognition of the mobile terminal 200, such as image recognition, face recognition, speech recognition, and text understanding, can be realized through the NPU.
The processor 210 has a memory disposed therein. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transfer instructions, and notification instructions, whose execution is controlled by the processor 210.
The charge management module 240 is configured to receive a charge input from a charger. The power management module 241 is used for connecting the battery 242, the charge management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and provides power to the processor 210, the internal memory 221, the display 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used to transmit and receive electromagnetic wave signals; the mobile communication module 250 may provide solutions for wireless communication, including 2G/3G/4G/5G, applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; and the wireless communication module 260 may provide solutions for wireless communication applied to the mobile terminal 200, including wireless local area networks (Wireless Local Area Networks, WLAN) (e.g., a wireless fidelity (Wi-Fi) network) and Bluetooth (BT). In some embodiments, the antenna 1 is coupled to the mobile communication module 250 and the antenna 2 is coupled to the wireless communication module 260, so that the mobile terminal 200 may communicate with networks and other devices through wireless communication techniques.
The mobile terminal 200 implements display functions through a GPU, a display screen 290, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The mobile terminal 200 may implement a photographing function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. The ISP is used to process data fed back by the camera module 291; the camera module 291 is used to capture still images or videos; the digital signal processor is used to process digital signals, including digital signals other than digital image signals; and the video codec is used to compress or decompress digital video. The mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the mobile terminal 200. The external memory card communicates with the processor 210 via an external memory interface 222 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 221 may be used to store computer executable program code that includes instructions. The internal memory 221 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (Universal Flash Storage, UFS), and the like. The processor 210 performs various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement audio functions through an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, an application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided at the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, pressure sensor 2802 may be disposed on display 290. The pressure sensor 2802 is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine a motion gesture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 2803. The gyro sensor 2803 can be used in scenarios such as shooting anti-shake, navigation, and motion-sensing games.
In addition, sensors for other functions, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices that provide auxiliary functionality may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, etc., by which a user can generate key signal inputs related to user settings and function controls of the mobile terminal 200. As another example, indicator 292, motor 293, SIM card interface 295, and the like.
An information acquisition method and an information acquisition apparatus according to exemplary embodiments of the present disclosure are specifically described below. Fig. 3 shows a schematic flowchart of the information acquisition method. As shown in Fig. 3, the method comprises at least the following steps:
Step S310: acquiring a migration image and a reference image, wherein the migration image is an image obtained after color migration of a target image according to the reference image;
Step S320: projecting the reference image and the migration image by using a projection vector to obtain a reference projection map corresponding to the reference image and a migration projection map corresponding to the migration image;
Step S330: performing histogram matching on the reference projection map and the migration projection map to obtain a projection matching map, and acquiring target migration result information of the migration image according to the projection matching map and the migration projection map.
According to the information acquisition method, the migration image and the reference image are projected to a single channel through projection vectors, and histogram matching is performed directly on the single-channel reference projection map and migration projection map to obtain the color probability distribution of the image. Target migration result information is then acquired based on the color probability distribution, which improves the accuracy of the target migration result information.
In order to make the technical solution of the present disclosure clearer, each step of the information acquisition method is described next.
In step S310, a migration image and a reference image are acquired, the migration image being an image obtained after color migration of the target image according to the reference image.
In an exemplary embodiment of the present disclosure, the target image refers to an image to be processed by color migration. The target image may be an image photographed by an image collection unit or an image drawn with image editing software; of course, it may also be any other type of to-be-processed image designated for color migration, which is not particularly limited in this exemplary embodiment.
The reference image refers to a source image for providing texture information and color information used for color migration, and may be any one image. For example, the reference image may be an image having a cool-warm tone or an image having a painting style, which is not particularly limited in this exemplary embodiment.
The color migration refers to extracting color features from a specified reference image, and performing color migration on the content structure of the target image by using the extracted color features on the premise of not damaging the content structure of the target image so as to obtain a migration image. The obtained migration image has the structure information and the shape information of the target image, and also has the texture information and the color information of the reference image.
For example, color migration is performed on the target image B based on the reference image a to obtain a migrated image C having texture information and color information similar to or the same as those of the reference image a, and having structure information and shape information similar to or the same as those of the target image B.
In step S320, the reference image and the migration image are projected using the projection vector to obtain a reference projection map corresponding to the reference image and a migration projection map corresponding to the migration image.
In an exemplary embodiment of the present disclosure, a projection vector may be preconfigured, and the reference image and the migration image may be projected using the projection vector. The projection vector may be a three-dimensional projection vector (x, y, z), where x, y, z may be any number, and the present exemplary embodiment is not limited in particular.
Specifically, the RGB pixel values of each pixel in the reference image and the migration image are projected to a single dimension according to the projection vector (x, y, z) to obtain a single-dimensional pixel value for each pixel. A reference projection map and a migration projection map corresponding to the reference image and the migration image, respectively, are then constructed from the single-dimensional pixel values of their pixels. The reference projection map and the migration projection map are single-channel images.
For example, if the pixel value on the RGB channels of a certain pixel point in the reference image is (r, g, b), projecting the pixel point with the projection vector yields the single-dimensional pixel value v = r×x + g×y + b×z. Each pixel point in the reference image is projected in this way, and the reference projection map corresponding to the reference image is constructed from the single-dimensional pixel values of the pixel points.
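The projection step above can be sketched as follows. This is a minimal illustration in NumPy; the function name and the toy values are hypothetical, not prescribed by the disclosure.

```python
import numpy as np

def project_to_single_channel(image, vec):
    """Project an H x W x 3 RGB image onto a 3-D projection vector (x, y, z),
    yielding a single-channel projection map: v = r*x + g*y + b*z per pixel."""
    image = np.asarray(image, dtype=np.float64)
    vec = np.asarray(vec, dtype=np.float64)
    return image @ vec  # contracts the last (RGB) axis against the vector

# A 1x2 toy image and an example projection vector (x, y, z) = (1, 2, 3).
img = np.array([[[10, 20, 30], [1, 0, 2]]], dtype=np.float64)
proj = project_to_single_channel(img, (1.0, 2.0, 3.0))
# pixel 0: 10*1 + 20*2 + 30*3 = 140; pixel 1: 1*1 + 0*2 + 2*3 = 7
```

The result has one value per pixel, i.e. the single-channel projection map described above.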
In addition, a plurality of projection vectors may be configured, and the reference image and the migration image may each be projected using the plurality of projection vectors to obtain a plurality of reference projection maps and a plurality of migration projection maps, respectively. The number of projection vectors may be, for example, 216 or 360, and is not particularly limited in the present disclosure.
In step S330, histogram matching is performed on the reference projection map and the migration projection map to obtain a projection matching map, and target migration result information of the migration image is obtained according to the projection matching map and the migration projection map.
In an exemplary embodiment of the present disclosure, the target migration result information may include a target migration result score, and the target migration result score of the migration image may characterize the migration error of color migrating the target image based on the reference image. For example, the larger the target migration result score in the target migration result information, the larger the migration error of the migration image and the poorer the migration effect.
In an exemplary embodiment of the present disclosure, histogram matching, also referred to as histogram specification, is an operation that transforms the gray-level distribution of an image into a specified distribution. That is, the histogram of the original image is matched to that of the template image to obtain a matched image. The original image and the matched image have the same size, and the template image and the matched image have the same probability distribution.
Specifically, the reference projection map is used as the template image in histogram matching, the migration projection map is used as the original image in histogram matching, and histogram matching is performed to obtain a projection matching map. The projection matching map has the same probability distribution as the reference projection map, and the same size as the migration projection map.
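Histogram matching of the two single-channel projection maps can be sketched as follows. This is one common CDF-based realization of histogram specification in NumPy; the function name is hypothetical, and the disclosure does not prescribe a particular matching algorithm.

```python
import numpy as np

def histogram_match(source, template):
    """Transform `source` (the migration projection map, the 'original image')
    so its value distribution matches `template` (the reference projection map,
    the 'template image'). Returns an array shaped like `source`."""
    src_shape = np.asarray(source).shape
    src = np.asarray(source).ravel()
    tmpl = np.asarray(template).ravel()
    src_vals, src_idx, src_counts = np.unique(src, return_inverse=True,
                                              return_counts=True)
    tmpl_vals, tmpl_counts = np.unique(tmpl, return_counts=True)
    # Empirical CDFs of both single-channel images.
    src_cdf = np.cumsum(src_counts) / src.size
    tmpl_cdf = np.cumsum(tmpl_counts) / tmpl.size
    # Map each source quantile to the template value at the same quantile.
    matched_vals = np.interp(src_cdf, tmpl_cdf, tmpl_vals)
    return matched_vals[src_idx].reshape(src_shape)

source = np.array([[0, 1], [2, 3]], dtype=float)     # migration projection map
template = np.array([[10, 10], [20, 20]], dtype=float)  # reference projection map
matched = histogram_match(source, template)  # the projection matching map
```

As stated above, the output keeps the size of the migration projection map while its value distribution follows the reference projection map.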
In addition, if a plurality of reference projection maps and a plurality of migration projection maps are obtained from a plurality of projection vectors, histogram matching is performed between the reference projection map and the migration projection map corresponding to the same projection vector, so as to obtain a plurality of projection matching maps.
In an exemplary embodiment of the present disclosure, a pixel difference between a pixel value of each pixel in a projection matching diagram and a pixel value of a pixel point at the same position in a migration projection diagram is obtained, and target migration result information of a migration image is obtained according to the pixel difference of each pixel.
The absolute value of the pixel difference corresponding to each pixel point in the migration image may be calculated, and these absolute values may be configured as the target migration result information of the migration image. In this case, the target migration result information of the migration image is the migration error corresponding to each pixel point: the larger the absolute value of the pixel difference at a pixel point, the larger the migration error at that pixel point, i.e., the poorer the color migration effect there.
In addition, the sum of absolute values of pixel differences corresponding to all pixel points in the migration image can be calculated, and the sum of absolute values corresponding to all pixel points in the migration image is configured as target migration result information of the migration image. The target migration result information of the migration image is a migration error corresponding to the migration image, and if the sum of absolute values corresponding to all pixel points in the migration image is larger, the color migration effect of the whole migration image is poor.
Further, an average value of absolute values of pixel differences corresponding to all pixel points in the migration image may be calculated, and the average value of absolute values of pixel differences corresponding to all pixel points in the migration image may be configured as target migration result information of the migration image.
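The per-pixel differences and the sum and average aggregates described above can be sketched as follows (a minimal NumPy illustration; the helper name is hypothetical):

```python
import numpy as np

def migration_scores(matched, projected):
    """Per-pixel absolute differences between the projection matching map
    and the migration projection map, plus their sum and mean, which are
    candidate forms of the target migration result information."""
    diff = np.abs(np.asarray(matched, dtype=float)
                  - np.asarray(projected, dtype=float))
    return diff, float(diff.sum()), float(diff.mean())

m = np.array([[1.0, 4.0], [2.0, 2.0]])  # projection matching map
p = np.array([[1.0, 1.0], [5.0, 2.0]])  # migration projection map
per_pixel, total, avg = migration_scores(m, p)
# per-pixel: [[0, 3], [3, 0]]; sum = 6.0; mean = 1.5
```

A larger per-pixel value indicates a poorer migration effect at that pixel, and a larger sum or mean indicates a poorer effect for the image as a whole.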
In an exemplary embodiment of the present disclosure, if there are a plurality of projection matching maps, the pixel differences between the pixel value of each pixel point in the migration projection map and the pixel points at the same position in each projection matching map are calculated respectively. Each pixel point in the migration projection map thus corresponds to a plurality of pixel differences, and the sum, or the average, of the absolute values of the pixel differences corresponding to each pixel point is used as the target migration result information corresponding to that pixel point.
In addition, the sum of the target migration result information corresponding to all the pixel points in the migration image or the average value of the target migration result information corresponding to all the pixel points can be used as the target migration result information of the migration image.
For example, if N projection vectors are configured, firstly, N migration projection images and N reference projection images are obtained according to the N projection vectors, and histogram matching is performed on the migration projection images and the reference projection images obtained by using the same projection vector, so as to obtain N projection matching images;
Then, respectively calculating pixel differences between pixel values of each pixel point in each migration projection image and the pixel points at the same position in the corresponding projection matching image to obtain N pixel differences corresponding to each pixel point, and calculating average pixel differences of the N pixel differences corresponding to each pixel point;
And finally, calculating the sum of average pixel differences of all pixel points in the migration image or the average value of average pixel differences corresponding to all pixel points to obtain target migration result information of the migration image.
The error between the migration projection image and the projection matching image may be calculated using the L1 norm, and the error between the migration projection image and the projection matching image may be used as target migration result information of the migration image. That is, the error between the migration projection map E and the projection matching map F is mean (abs (E-F)).
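The multi-vector flow above, ending with the L1-norm error mean(abs(E − F)) averaged over the N projection vectors, can be sketched end to end as follows. The helper names are hypothetical, and the histogram-matching step is one common CDF-based realization under the same assumptions as the earlier sketch.

```python
import numpy as np

def _project(image, vec):
    # Project an H x W x 3 image onto one 3-D projection vector.
    return np.asarray(image, dtype=float) @ np.asarray(vec, dtype=float)

def _hist_match(source, template):
    # CDF-based histogram specification of `source` toward `template`.
    src = source.ravel()
    tmpl = template.ravel()
    s_vals, s_idx, s_cnt = np.unique(src, return_inverse=True,
                                     return_counts=True)
    t_vals, t_cnt = np.unique(tmpl, return_counts=True)
    matched = np.interp(np.cumsum(s_cnt) / src.size,
                        np.cumsum(t_cnt) / tmpl.size, t_vals)
    return matched[s_idx].reshape(source.shape)

def migration_error(migrated, reference, vectors):
    """Average over N projection vectors of mean(abs(E - F)), where E is the
    migration projection map and F the corresponding projection matching map."""
    errors = []
    for vec in vectors:
        e = _project(migrated, vec)                     # migration projection map E
        f = _hist_match(e, _project(reference, vec))    # projection matching map F
        errors.append(np.mean(np.abs(e - f)))
    return float(np.mean(errors))

img = np.arange(12.0).reshape(2, 2, 3)
vectors = [(1.0, 0.0, 0.0), (0.0, 1.0, 1.0)]
err = migration_error(img, img, vectors)  # identical images give zero error
```

When the migration image already matches the reference distribution exactly, the error is zero; larger values indicate a poorer color migration effect.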
In an exemplary embodiment of the present disclosure, fig. 4 is a schematic flowchart of a method for obtaining target migration result information, and as shown in fig. 4, the flowchart includes at least steps S410 to S430, which are described in detail below:
In step S410, first migration result information of the migration image is obtained according to the projection matching map and the pixel values of the pixels in the migration projection map.
In an exemplary embodiment of the present disclosure, the pixel difference between the pixel value of each pixel in the projection matching map and the pixel value of the pixel at the same position in the migration projection map may be obtained, and the sum of the pixel differences over all pixels may be configured as the first migration result information. The first migration result information is obtained from the pixel values of the projection matching map and the migration projection map in the same manner as the target migration result information in the embodiments above, which have already been described in detail, so the details are not repeated here.
In step S420, a distribution model of the reference image is constructed according to the pixel information of the reference image, and second migration result information corresponding to the migration image is calculated according to the distribution model.
In an exemplary embodiment of the present disclosure, pixel values of a reference image are input into a gaussian mixture model to obtain a probability distribution model corresponding to the reference image.
Specifically, a Gaussian mixture model (Gaussian Mixed Model, GMM) quantizes a distribution precisely using Gaussian probability density functions, decomposing it into several components each based on a Gaussian probability density function (normal distribution curve). Fitting a GMM to the input reference image yields the probability distribution model corresponding to the reference image.
The probability distribution models corresponding to the reference images can be built in advance, and the probability distribution models are stored in a database. And when the color migration is carried out on the target image according to the reference image, directly acquiring a probability distribution model corresponding to the reference image in a database.
In an exemplary embodiment of the present disclosure, Fig. 5 shows a flowchart of a method for obtaining the second migration result information. As shown in Fig. 5, the flow includes at least steps S510 to S520, described in detail below. In step S510, the pixel value of each pixel in the migration image is input into the probability distribution model to obtain distribution information corresponding to each pixel in the migration image. In step S520, the distribution information corresponding to the pixels in the migration image is summed to obtain the second migration result information.
Specifically, the distribution information corresponding to each pixel in the migration image is summed, that is, the distribution information corresponding to all the pixels in the migration image is added, and the result obtained by adding is the second migration result information of the migration image. Wherein the size of the second migration result information characterizes a color difference of the migrated image and the reference image. The larger the second migration result information is, the larger the color difference between the migration image and the reference image is, which indicates that the color migration effect of the target image based on the reference image is worse.
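Steps S510 to S520 can be sketched as follows, under the assumption that the "distribution information" of a pixel is its negative log-likelihood under the reference image's Gaussian mixture model (larger values meaning less probable, hence a larger color difference, consistent with the text above). In practice the GMM parameters would be fit to the reference image, e.g. by EM; here a diagonal-covariance mixture with supplied parameters is evaluated directly, and all names are hypothetical.

```python
import numpy as np

def gmm_neg_log_likelihood(pixels, weights, means, variances):
    """Negative log-likelihood of each RGB pixel under a diagonal-covariance
    Gaussian mixture (assumed stand-in for the per-pixel distribution info)."""
    x = np.asarray(pixels, dtype=float)[:, None, :]        # (P, 1, 3)
    mu = np.asarray(means, dtype=float)[None, :, :]        # (1, K, 3)
    var = np.asarray(variances, dtype=float)[None, :, :]   # (1, K, 3)
    # Per-component log density of each pixel.
    log_norm = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var,
                             axis=2)
    log_mix = np.log(np.asarray(weights, dtype=float))[None, :] + log_norm
    # Stable log-sum-exp over the K mixture components.
    m = log_mix.max(axis=1, keepdims=True)
    return -(m[:, 0] + np.log(np.exp(log_mix - m).sum(axis=1)))

def second_migration_score(pixels, weights, means, variances):
    """Step S520: sum the per-pixel distribution information."""
    return float(gmm_neg_log_likelihood(pixels, weights, means,
                                        variances).sum())

# One-component mixture centered at black; a black pixel is more probable
# (lower negative log-likelihood) than a distant one.
pixels = [[0.0, 0.0, 0.0], [3.0, 3.0, 3.0]]
weights, means, variances = [1.0], [[0.0, 0.0, 0.0]], [[1.0, 1.0, 1.0]]
nll = gmm_neg_log_likelihood(pixels, weights, means, variances)
score = second_migration_score(pixels, weights, means, variances)
```

A larger summed score indicates a larger color difference between the migration image and the reference image, i.e., a poorer migration effect.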
In step S430, target migration result information of the migration image is obtained according to the first migration result information and the second migration result information.
In an exemplary embodiment of the present disclosure, after obtaining first migration result information and second migration result information, a weight value is assigned to the first migration result information and the second migration result information, and target migration result information of a migration image is calculated according to the weight values corresponding to the first migration result information and the second migration result information.
Specifically, the weight values of the first migration result information and the second migration result information may be configured as k1 and k2, respectively; the target migration result information may then be obtained according to formula (1), which is as follows:
w = k1 × a + k2 × b (1)
where a is the first migration result information, b is the second migration result information, and w is the target migration result information. The values of k1 and k2 may be set according to the actual scenario; for example, k1 may be set to 1 and k2 to 10, which is not specifically limited in this disclosure.
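Formula (1) can be expressed directly in code (the function name is hypothetical; the default weights follow the example values of k1 = 1 and k2 = 10 mentioned above):

```python
def target_migration_score(a, b, k1=1.0, k2=10.0):
    """Combine the first (a) and second (b) migration result scores with
    weights k1 and k2, as in formula (1): w = k1*a + k2*b."""
    return k1 * a + k2 * b

# Example: a = 0.5, b = 0.2 with the example weights k1 = 1, k2 = 10.
w = target_migration_score(0.5, 0.2)  # 0.5*1 + 0.2*10 = 2.5
```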
In addition, if the first migration result information includes, for each pixel position in the migration image, a corresponding first migration result value, the distribution information corresponding to each pixel in the migration image may be configured as the second migration result information, and formula (1) may then be used to obtain the target migration result information corresponding to each pixel in the migration image.
In an exemplary embodiment of the present disclosure, fig. 6 is a schematic flowchart of a method for obtaining target migration result information, and as shown in fig. 6, the flowchart includes at least steps S610 to S630, which are described in detail below:
in step S610, pixel values of pixels in the reference image are input into the gaussian mixture model to obtain a probability distribution model corresponding to the reference image;
In step S620, the pixel values of the pixels in the migrated image are respectively input into the probability distribution model corresponding to the reference image, so as to obtain the distribution information corresponding to the pixels in the migrated image;
In step S630, the distribution information corresponding to all the pixels in the migration image is summed, and the result of the summation is configured as the target migration result information of the migration image.
In the method for acquiring the target migration result information in this embodiment, the probability that the pixels in the migration image appear in the reference image is calculated by maximum likelihood, and the result of the color migration is analyzed from a global perspective.
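A rough sketch of steps S610 to S630 follows. It is not the disclosure's implementation: for brevity it replaces the full Gaussian mixture model with a single Gaussian per colour channel, and the "images" are randomly generated stand-ins; the per-pixel log-likelihoods are summed as in step S630:

```python
import math
import random

# Random stand-in "images": lists of RGB triples in [0, 255].
random.seed(0)
reference = [[random.randint(0, 255) for _ in range(3)] for _ in range(1024)]
migrated = [[random.randint(0, 255) for _ in range(3)] for _ in range(1024)]

def fit_gaussian(pixels):
    """S610 (simplified): estimate per-channel mean and variance of the reference."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    variances = [sum((p[c] - means[c]) ** 2 for p in pixels) / n + 1e-6
                 for c in range(3)]
    return means, variances

def log_likelihood(pixel, means, variances):
    """S620: log-probability of one migrated pixel under the reference model."""
    return sum(-0.5 * (math.log(2 * math.pi * v) + (pixel[c] - m) ** 2 / v)
               for c, (m, v) in enumerate(zip(means, variances)))

means, variances = fit_gaussian(reference)
# S630: sum the distribution information over all pixels of the migrated image.
second_result = sum(log_likelihood(p, means, variances) for p in migrated)
```

A production version would fit a multi-component mixture (e.g. via expectation-maximization) rather than a single Gaussian, but the maximum-likelihood scoring idea is the same.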
In an exemplary embodiment of the present disclosure, when target migration result information of a migration image does not satisfy a preset migration condition, weighted average processing is performed on pixel values of pixels in the migration image and a reference image, and the migration image is updated according to a weighted average processing result.
Specifically, it is judged whether the target migration result information of the migration image meets a preset migration condition. The preset migration condition may be that a target migration result score in the target migration result information is smaller than a score threshold value; if the target migration result score is smaller than the score threshold value, the target migration result information is judged to meet the preset migration condition. The preset migration condition and the score threshold value may be set according to the actual application scenario, and this disclosure is not particularly limited thereto.
In addition, when the target migration result information of the migration image does not meet the preset migration condition, a weighted average is taken, at each position, of the pixel values of the migration image and the reference image, and the weighted-average pixel values are used as the pixel values of the updated migration image. For example, suppose the RGB pixel value at a certain pixel point in the migration image is (100,100,100) and the RGB pixel value at the same pixel point in the reference image is (60,60,60). If the same weight value of 0.5 is set for the pixel values of the reference image and the migration image, the average pixel value at that pixel point is (80,80,80), and the RGB pixel value at that pixel point in the migration image is updated to (80,80,80). Of course, different weight values may also be set for the pixel values of the reference image and the migration image; for example, weights of 0.25 for the migration image and 0.75 for the reference image would update the RGB pixel value at that pixel point to (70,70,70). The weight values of the pixel values of the reference image and the migration image may be set according to the actual application scene, which is not particularly limited in the present disclosure.
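The per-position weighted-average update described above may be sketched as follows; the function name and the unequal 0.25/0.75 weights are illustrative assumptions:

```python
# Sketch of the update step: per-position weighted average of the
# migration-image and reference-image pixel values.
def update_pixel(migrated_rgb, reference_rgb, w_migrated=0.5, w_reference=0.5):
    return tuple(round(w_migrated * m + w_reference * r)
                 for m, r in zip(migrated_rgb, reference_rgb))

# Equal weights reproduce the (100,100,100) and (60,60,60) -> (80,80,80) example.
assert update_pixel((100, 100, 100), (60, 60, 60)) == (80, 80, 80)
# Illustrative unequal weights (0.25 / 0.75) give a (70,70,70) result instead.
assert update_pixel((100, 100, 100), (60, 60, 60), 0.25, 0.75) == (70, 70, 70)
```

In practice the same weights would be applied to every pixel position of the two images.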
Fig. 7 is a schematic flowchart of an information acquisition method according to the present embodiment. As shown in Fig. 7, the flow includes at least steps S710 to S780, which are described in detail as follows:
In step S710, a migration image and a reference image are acquired;
the migration image is an image obtained by performing color migration on the target image according to the reference image.
In step S720, the reference image and the migration image are projected by using a projection vector, so as to obtain a reference projection map corresponding to the reference image and a migration projection map corresponding to the migration image;
A plurality of projection vectors may be set, and a plurality of reference projection maps and migration projection maps are obtained based on the plurality of projection vectors.
In step S730, histogram matching is performed on the reference projection map and the migration projection map to obtain a projection matching map;
In step S740, pixel differences between the pixel values of the pixels in the projection matching map and the pixel values of the corresponding pixels in the migration projection map are obtained, an average value of the pixel differences corresponding to all the pixels is calculated, and the average value of the pixel differences is configured as the first migration result information;
In step S750, the pixel values of the pixels in the reference image are input into a Gaussian mixture model to obtain a probability distribution model corresponding to the reference image;
In step S760, the pixel values of the pixels in the migrated image are input into the probability distribution model corresponding to the reference image to obtain the distribution information corresponding to the pixels in the migrated image;
In step S770, the distribution information corresponding to all the pixels in the migration image is summed to obtain the second migration result information of the migration image;
In step S780, weight values are assigned to the first migration result information and the second migration result information of the migration image, and target migration result information is obtained according to the weight values corresponding to the first migration result information and the second migration result information.
The method for obtaining the target migration result information in this embodiment comprises: first projecting the reference image and the migration image onto a single channel, performing histogram matching on the single-channel projection maps (a marginal-probability-error analysis), and obtaining the first migration result information according to the matching result; then calculating, by maximum likelihood, the probability that the pixels in the migration image appear in the reference image to obtain the second migration result information; and finally obtaining the target migration result information according to the first migration result information and the second migration result information. With this information acquisition method, the color migration result is objectively analyzed from both global and detail perspectives, so that the accuracy of the target migration result information is improved; the migration image is then updated according to the color migration result, so that the accuracy of the color migration is improved and the user experience is further improved.
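The projection and histogram-matching portion of the flow above (steps S720 to S740) might be sketched as follows. The rank-based matching, the example projection vector, and the random stand-in images are all illustrative assumptions, not the disclosure's implementation:

```python
import random

# Random stand-in "images": lists of RGB triples in [0, 255].
random.seed(1)
reference = [[random.randint(0, 255) for _ in range(3)] for _ in range(256)]
migrated = [[random.randint(0, 255) for _ in range(3)] for _ in range(256)]

def project(pixels, vector):
    """S720: reduce each RGB pixel to a scalar along the projection vector."""
    return [sum(p[c] * vector[c] for c in range(3)) for p in pixels]

def match_histogram(source, template):
    """S730 (rank-based sketch): give the source values the template's
    distribution by mapping each source value to the template value of
    the same rank."""
    order = sorted(range(len(source)), key=lambda i: source[i])
    template_sorted = sorted(template)
    matched = [0.0] * len(source)
    for rank, i in enumerate(order):
        matched[i] = template_sorted[rank]
    return matched

vector = (0.577, 0.577, 0.577)                 # example projection direction
ref_proj = project(reference, vector)           # reference projection map
mig_proj = project(migrated, vector)            # migration projection map
matched = match_histogram(mig_proj, ref_proj)   # projection matching map

# S740: mean pixel difference between the matching map and the projection map.
first_result = sum(abs(a - b) for a, b in zip(matched, mig_proj)) / len(mig_proj)
```

A full pipeline would repeat this for several projection vectors and combine the result with the Gaussian-model score according to formula (1).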
Those skilled in the art will appreciate that all or part of the steps implementing the above embodiments may be implemented as a computer program executed by a CPU. When the computer program is executed by the CPU, it performs the functions defined by the above-described method provided by the present disclosure. The program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
Furthermore, it should be noted that the above-described figures are merely illustrative of the processes involved in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
The following describes an embodiment of an apparatus of the present disclosure that may be used to perform the above-described information acquisition method of the present disclosure. For details not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the information acquisition method described above in the present disclosure.
Fig. 8 schematically illustrates a block diagram of an information acquisition apparatus according to one embodiment of the present disclosure.
Referring to fig. 8, an information acquisition apparatus 800 according to one embodiment of the present disclosure includes an image acquisition module 801, an image projection module 802, and an information acquisition module 803. Specifically:
An image obtaining module 801, configured to obtain a migration image and a reference image, where the migration image is an image obtained after performing color migration on a target image according to the reference image;
an image projection module 802, configured to project the reference image and the migration image by using the projection vector, so as to obtain a reference projection image corresponding to the reference image and a migration projection image corresponding to the migration image;
the information obtaining module 803 is configured to perform histogram matching on the reference projection map and the migration projection map to obtain a projection matching map, and obtain target migration result information of the migration image according to the projection matching map and the migration projection map.
In an exemplary embodiment of the present disclosure, the information acquisition module 803 may further include a first information acquisition unit, a second information acquisition unit, and a target information acquisition unit, wherein:
The first information acquisition unit is used for acquiring the first migration result information of the migration image according to the projection matching map and the pixel values of the pixels in the migration projection map;
the second information acquisition unit is used for constructing a distribution model of the reference image according to the pixel information of the reference image, and calculating second migration result information corresponding to the migration image according to the distribution model;
the target information acquisition unit is used for acquiring target migration result information of the migration image according to the first migration result information and the second migration result information.
In an exemplary embodiment of the present disclosure, the second information obtaining unit may be further configured to input the pixel values of the reference image into a Gaussian mixture model to obtain a probability distribution model corresponding to the reference image.
In an exemplary embodiment of the present disclosure, the second information obtaining unit may be further configured to input a pixel value of each pixel in the migration image into the probability distribution model to obtain distribution information corresponding to each pixel in the migration image, and sum the distribution information corresponding to each pixel in the migration image to obtain second migration result information.
In an exemplary embodiment of the present disclosure, the first information obtaining unit may be further configured to obtain a pixel difference between the pixel value of each pixel in the projection matching map and the pixel value of the corresponding pixel in the migration projection map, and configure a sum of the pixel differences corresponding to the pixels as the first migration result information.
In an exemplary embodiment of the present disclosure, the target information obtaining unit may be further configured to assign a weight value to the first migration result information and the second migration result information, and calculate target migration result information of the migration image according to the weight values corresponding to the first migration result information and the second migration result information.
In an exemplary embodiment of the present disclosure, the information obtaining apparatus 800 may further include an image update module (not shown in the figure), where the image update module is configured to perform weighted average processing on pixel values of each pixel in the migration image and the reference image when the target migration result information of the migration image does not meet the preset migration condition, and update the migration image according to the weighted average processing result.
The specific details of each module in the above information acquisition device are already described in the embodiment of the information acquisition method section, and the details not disclosed can be referred to the embodiment of the information acquisition method section, so that the details are not described again.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, aspects of the present disclosure may be embodied in the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein collectively as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification when the program product is run on the terminal device; for example, any one or more of the steps of Figs. 3 to 7 may be carried out.
Exemplary embodiments of the present disclosure also provide a program product for implementing the above method, which may employ a portable compact disc read-only memory (CD-ROM) and comprise program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of a readable storage medium include an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.