
CN103473747A - Method and device for transmitting colors - Google Patents


Info

Publication number
CN103473747A
CN103473747A (application CN201310441342)
Authority
CN
China
Prior art keywords
color
original image
pixel point
ctwc
centerdot
Prior art date
Legal status
Pending
Application number
CN2013104413422A
Other languages
Chinese (zh)
Inventor
普园媛
徐丹
苏迤
魏小敏
赵征鹏
王朝晖
Current Assignee
Yunnan University YNU
Original Assignee
Yunnan University YNU
Priority date
Filing date
Publication date
Application filed by Yunnan University YNU filed Critical Yunnan University YNU
Priority to CN2013104413422A priority Critical patent/CN103473747A/en
Publication of CN103473747A publication Critical patent/CN103473747A/en
Pending legal-status Critical Current


Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a method and device for color transfer. The method operates in the Lαβ color space and comprises: selecting one or more pairs of corresponding regions in an original image and a target image according to the color transfer task; setting a color transfer weight coefficient (CTWC) for each pixel of the original image for each pair of corresponding regions, where the CTWC of a pixel is inversely proportional to the color statistical distance from that pixel to the currently selected region of the original image; and performing the linear transformation of color transfer on the original image according to the CTWCs of the pixels for each pair of corresponding regions, obtaining the transformed image. The method and device achieve local color transfer and meet the diversified demands of users.

Description

Method and apparatus for color transfer
Technical Field
The present invention relates to the field of image processing, and in particular, to a method and apparatus for color delivery.
Background
Color transfer is a method for automatically adding or changing the color of an image by a specific method, that is, designating an original image and a target image, and transferring color information in the target image to the original image by a color transfer method so that the color of the original image is changed to have a color characteristic similar to that of the target image.
The prior art color transfer is generally directed to global color transfer, and the global color transfer method is based on the criterion that similar images have similar statistical information. The method carries out linear transformation on the first-order and second-order statistical characteristics of the original image and the target image, so that the original image has the statistical characteristics similar to those of the target image, and the transformed original image visually achieves the effect similar to the color of the target image.
However, the global color delivery method transforms the colors of the whole original image, and cannot meet the requirement of user personalization.
Aiming at the problem that the global color transmission mode cannot meet the personalized requirements of users, an effective solution is not provided at present.
Disclosure of Invention
An object of the present invention is to provide a method and an apparatus for color transfer, so as to solve the above problems.
According to one aspect of the embodiments of the present invention, there is provided a method of color transfer, performed in the Lαβ color space, comprising: selecting one or more pairs of corresponding regions in the original image and the target image according to the color transfer task; setting a color transfer weight coefficient (CTWC) for each pixel point in the original image for each pair of corresponding regions, where the CTWC of a pixel point is inversely proportional to the color statistical distance from that pixel point to the currently selected region of the original image; and performing the linear transformation of color transfer on the original image according to the CTWCs of the pixel points for each pair of corresponding regions, to obtain the transformed image.
According to another aspect of the embodiments of the present invention, there is provided an apparatus for color transfer, comprising: a region selection module for selecting one or more pairs of corresponding regions in the original image and the target image according to the color transfer task; a color transfer weight coefficient setting module for setting a CTWC, based on the Lαβ color space, for each pixel point in the original image for each pair of corresponding regions selected by the region selection module, where the CTWC of a pixel point is inversely proportional to the color statistical distance from that pixel point to the currently selected region of the original image; and a color linear transformation module for performing the linear transformation of color transfer on the original image according to the CTWCs of the pixel points for each pair of corresponding regions, as set by the color transfer weight coefficient setting module, to obtain the transformed image.
The embodiment of the invention adopts the L alpha beta color space with stronger color channel independence, sets the inverse proportion of the CTWC of the pixel point to the color statistical distance from the pixel point to the currently selected area of the original image, and then carries out color transmission based on the CTWC of each pixel point, thereby achieving better local color transmission effect and adapting to the diversified requirements of users.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a flow chart of a method of color delivery provided by an embodiment of the present invention;
FIG. 2 is a schematic diagram of a local color transfer mapping region according to an embodiment of the present invention;
fig. 3 is a block diagram of an apparatus for color delivery according to an embodiment of the present invention.
Detailed Description
The invention will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In consideration of the fact that users sometimes need to perform local color transfer on images, the embodiments of the invention provide a method and a device for color transfer, based mainly on local color transfer driven by Color Transfer Weight Coefficients (CTWC).
Meanwhile, the L alpha beta color space is more in line with the human visual perception system, and when the L alpha beta color space is applied to a natural scene, the L alpha beta space obviously reduces the correlation among the color channels, so that the channels have certain mutual independence, the influence of the change of one channel on the other two channels can be reduced to the maximum extent, different operations can be carried out on different color channels, and the problem of channel intersection can not occur. In order to achieve a better local color transfer effect, the embodiment of the present invention selects the L α β color space as the execution space of the local color transfer. Where L channel denotes an achromatic channel, i.e. a luminance channel, α denotes a chromatic yellow-blue channel, and β denotes a chromatic red-green channel.
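For reference, the RGB-to-Lαβ conversion commonly used in color transfer work (the RGB→LMS→log→Lαβ chain popularized by Reinhard et al.) can be sketched as follows. This is an illustrative sketch, not code from the patent; the matrix values come from that widely cited formulation, and the function names are assumptions.

```python
import numpy as np

# RGB -> LMS matrix (values from the commonly cited Reinhard et al.
# "Color Transfer between Images" formulation; an assumption here).
RGB2LMS = np.array([[0.3811, 0.5783, 0.0402],
                    [0.1967, 0.7244, 0.0782],
                    [0.0241, 0.1288, 0.8444]])

# Decorrelating transform from log-LMS to L-alpha-beta:
# L = (l+m+s)/sqrt(3), alpha = (l+m-2s)/sqrt(6), beta = (l-m)/sqrt(2).
LMS2LAB = np.diag([1 / np.sqrt(3), 1 / np.sqrt(6), 1 / np.sqrt(2)]) @ \
          np.array([[1, 1, 1],
                    [1, 1, -2],
                    [1, -1, 0]], dtype=float)

def rgb_to_lab(rgb):
    """Convert an (H, W, 3) RGB image with values in (0, 1] to L-alpha-beta."""
    lms = rgb @ RGB2LMS.T
    log_lms = np.log10(np.maximum(lms, 1e-6))  # avoid log(0) on dark pixels
    return log_lms @ LMS2LAB.T

def lab_to_rgb(lab):
    """Inverse conversion: L-alpha-beta back to RGB."""
    log_lms = lab @ np.linalg.inv(LMS2LAB).T
    lms = 10.0 ** log_lms
    return lms @ np.linalg.inv(RGB2LMS).T
```

The logarithm is what decorrelates the channels, which is why the Lαβ space allows per-channel operations without the channel-crosstalk problem the paragraph above describes.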
Referring to the flowchart of the method for color delivery shown in fig. 1, the method is performed based on L α β color space, and includes the following steps:
step S102, selecting one or more pairs of corresponding areas in the original image and the target image according to the color transmission task;
step S104, setting a color transfer weight coefficient (CTWC) for each pixel point in the original image for each pair of corresponding regions, where the CTWC of a pixel point is inversely proportional to the color statistical distance from that pixel point to the currently selected region of the original image;
and step S106, performing linear transformation of color transmission on the original image according to the CTWC of the pixel points corresponding to each pair of corresponding areas to obtain a transformed image.
The method of the embodiment adopts the L alpha beta color space with strong color channel independence, sets the inverse proportion of the CTWC of the pixel point to the color statistical distance from the pixel point to the currently selected area of the original image, and then carries out color transmission based on the CTWC of each pixel point, thereby achieving better local color transmission effect and adapting to the diversified requirements of users.
According to the requirement of color transmission, the selected areas in the original image and the target image can be marked by rectangular frames respectively, and the color range of pixels in the rectangular frames determines the color range required to be transmitted between the original image and the target image. In selecting the area, a plurality of pairs of corresponding areas may be selected according to actual needs, and in order to make the area selected in the original image correspond to the area selected in the target image one-to-one, the step of selecting one or more pairs of corresponding areas in the original image and the target image according to the color delivery task in this embodiment may include: selecting corresponding areas in the original image and the target image one by one according to the color transmission task; and setting the same mark for the corresponding areas selected in the original image and the target image. Through the simple and easy mode, two areas with the same identification can be determined as the areas corresponding to color transmission.
As shown in fig. 2, the image in the left frame is the original image and the image in the right frame is the target image, with the marked rectangles forming the k-th pair of corresponding regions in the original and target images; C_s(i, j) denotes the value of a color channel of the pixel (i, j) in the original image. According to the requirements of color transfer, corresponding regions are selected in the original image and the target image and marked with rectangular frames; the color range of the pixels inside a rectangular frame determines the color range to be transferred between the original image and the target image. When several pairs of corresponding regions are selected, the regions in the original image and the target image correspond one to one.
When setting the CTWC of each pixel point, the weight coefficient may be constructed from the magnitude response of a low-pass Butterworth filter. The CTWC represents the degree to which a pixel point in the original image is influenced by the color transfer of the selected region, and is inversely proportional to the color statistical distance from the pixel point to the selected region. The first-order and second-order statistics of the original image and the target image, namely the mean and the standard deviation, are computed in each corresponding region, and during the linear transformation the CTWC determines the degree of the transformation. Let the means of the color channels of the k-th selected region in the original image be $(\mu_l^k, \mu_\alpha^k, \mu_\beta^k)$, and let the color channel values of a pixel point (i, j) outside the k-th selected region be $(l_{ij}^k, \alpha_{ij}^k, \beta_{ij}^k)$. The color statistical distance between the pixel point (i, j) and the selected region is then

$x_{ij}^k = \sqrt{(l_{ij}^k - \mu_l^k)^2 + (\alpha_{ij}^k - \mu_\alpha^k)^2 + (\beta_{ij}^k - \mu_\beta^k)^2}$
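The per-pixel color statistical distance above can be sketched with NumPy as follows (an illustrative sketch assuming a boolean region mask; names are not from the patent):

```python
import numpy as np

def color_stat_distance(lab_img, region_mask):
    """Per-pixel Euclidean distance x_ij^k from the region's channel means.

    lab_img:     (H, W, 3) image in L-alpha-beta space
    region_mask: (H, W) boolean mask of the k-th selected region
    """
    # Channel means (mu_l, mu_alpha, mu_beta) over the selected region.
    mu = lab_img[region_mask].mean(axis=0)
    # Euclidean distance of every pixel's (l, alpha, beta) from those means.
    return np.sqrt(((lab_img - mu) ** 2).sum(axis=2))
```

Pixels inside the region whose color equals the region mean get distance 0; the farther a pixel's color lies from the region's mean color, the larger its distance.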
To compute the CTWC, a weight function w(x) is constructed that is inversely proportional to the color statistical distance $x_{ij}^k$: the larger $x_{ij}^k$ is, the larger the gap between the color of the pixel point (i, j) and the color of the selected region, and the less the pixel is affected by the color transfer; the smaller $x_{ij}^k$ is, the smaller that gap, and the more the pixel is affected by the color transfer. The present embodiment computes the CTWC with reference to a low-pass Butterworth filter. The Butterworth filter is characterized by a frequency response that is maximally flat in the passband, without ripple, and rolls off gradually to zero in the stopband. On the Bode plot of log amplitude against angular frequency, the amplitude decreases steadily from a certain boundary angular frequency as the angular frequency increases, tending toward negative infinity. The low-pass filter can be expressed by the following amplitude-squared-versus-frequency equation:
$|H(w)|^2 = \frac{1}{1 + (w/w_c)^{2n}} = \frac{1}{1 + \epsilon^2 (w/w_p)^{2n}}$

where n is the order of the filter, $w_c$ is the frequency at which the amplitude drops to -3 dB, and $w_p$ is the passband edge frequency.
The frequency w of the filter corresponds to the color statistical distance $x_{ij}^k$, and the cutoff frequency $w_c$ corresponds to the color cutoff statistical distance $x_c$. Thus the CTWC weight function in Butterworth form can be written as

$w(x_{ij}^k) = \frac{1}{1 + (x_{ij}^k / x_c)^{2N}}$

where $x_c$ is the set color cutoff statistical distance and N is the set filter order; from this formula, the color cutoff statistical distance $x_c$ and the order N can be used to control the effect of the local color transfer.
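A minimal sketch of this Butterworth-form weight function (illustrative, not the patent's code):

```python
import numpy as np

def ctwc(x, x_c, N):
    """Butterworth-style color transfer weight: w(x) = 1 / (1 + (x/x_c)^(2N)).

    x   : color statistical distance(s), scalar or array
    x_c : color cutoff statistical distance (user-set)
    N   : filter order (user-set); larger N gives a sharper falloff
    """
    return 1.0 / (1.0 + (np.asarray(x, dtype=float) / x_c) ** (2 * N))
```

By construction w(0) = 1 (a pixel whose color equals the region mean is fully affected) and w(x_c) = 0.5, so x_c marks the half-influence distance, exactly as the -3 dB cutoff does for the filter.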
For the kth selection region, the CTWC of the pixel points therein may be set to 1.
Setting the CTWC in this way achieves a good local color transfer effect while keeping the influence on other regions small.
When a single pair of corresponding regions is selected in the original image and the target image, the linear transformation of color transfer may be applied to one color channel of the original image using the formula

$C_s^{new}(i, j) = C_s(i, j) + w(x_{ij}) \cdot \left( \mu_t + \frac{\sigma_t}{\sigma_s} (C_s(i, j) - \mu_s) - C_s(i, j) \right)$

where $C_s^{new}(i, j)$ is the transformed value of the corresponding color channel at pixel point (i, j) in the original image; $C_s(i, j)$ is the original value of that channel at (i, j); $w(x_{ij})$ is the CTWC of the pixel point (i, j); $\mu_s$ and $\mu_t$ are the means of the corresponding color channel over the corresponding regions of the original image and the target image, respectively; and $\sigma_s$ and $\sigma_t$ are the standard deviations of that channel over those regions, respectively.
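The single-pair transform can be sketched per channel as follows (NumPy; an illustrative sketch with assumed names, not the patent's code). It blends the classic mean/standard-deviation mapping with the original values according to the per-pixel weight:

```python
import numpy as np

def transfer_channel(C_s, w, mu_s, sigma_s, mu_t, sigma_t):
    """Weighted single-region color transfer for one channel.

    C_s : (H, W) source channel values
    w   : (H, W) per-pixel CTWC map
    mu_s, sigma_s : mean / std of the channel over the source region
    mu_t, sigma_t : mean / std of the channel over the target region
    """
    # Global statistics-matching map applied to every pixel...
    mapped = mu_t + (sigma_t / sigma_s) * (C_s - mu_s)
    # ...then blended with the original values by the CTWC.
    return C_s + w * (mapped - C_s)
```

With w = 1 a pixel receives the full statistics-matching transform; with w = 0 it is left unchanged, which is what confines the transfer to the selected region's colors.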
When the corresponding area selected in the original image and the target image is M (M > 1) pairs, that is, the current task is a multi-area local color transfer task, the multi-area local color transfer can be realized by adopting an image iterative fusion method or a weighted average fusion method.
The image iterative fusion method comprises the following steps: and sequentially performing local color transfer according to the sequence of the selected regions, and finishing the next transfer on the basis of the previous transfer result, so that the finally obtained result graph is the result of iterative color transfer of all corresponding regions. The concrete implementation is as follows:
according to the sequence of the selected regions, the following formula is adopted to carry out local color transmission on each pair of corresponding regions in sequence:
$C_s^k(i, j) = C_s^{k-1}(i, j) + w(x_{ij}^k) \cdot \left( \mu_t^k + \frac{\sigma_t^k}{\sigma_s^k} (C_s^{k-1}(i, j) - \mu_s^k) - C_s^{k-1}(i, j) \right), \quad k = 1, 2, \ldots, M$

where M > 1; $C_s^0(i, j)$ is the initial value of a color channel at pixel point (i, j) in the original image; $C_s^k(i, j)$ is the result after the color of the channel has been transferred for the k-th corresponding region; $w(x_{ij}^k)$ is the CTWC of the pixel point (i, j) with respect to the k-th corresponding region; $\mu_s^k$ and $\mu_t^k$ are the means of the corresponding color channel over the k-th corresponding regions in the original image and the target image, respectively; and $\sigma_s^k$ and $\sigma_t^k$ are the corresponding standard deviations.
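The iterative fusion of the M region pairs can be sketched as follows (NumPy; an illustrative sketch with assumed names, not the patent's code). Each pass updates the running result with the next region's weight map and statistics:

```python
import numpy as np

def iterative_fusion(C_s, weights, stats):
    """Iterative multi-region transfer for one channel.

    C_s     : (H, W) initial channel values C_s^0
    weights : list of (H, W) CTWC maps w(x_ij^k), k = 1..M
    stats   : list of (mu_s^k, sigma_s^k, mu_t^k, sigma_t^k) tuples
    """
    C = C_s.astype(float).copy()
    for w, (mu_s, sigma_s, mu_t, sigma_t) in zip(weights, stats):
        # k-th pass: statistics-matching map of the current result...
        mapped = mu_t + (sigma_t / sigma_s) * (C - mu_s)
        # ...blended in by that region's CTWC; becomes input to pass k+1.
        C = C + w * (mapped - C)
    return C
```

Because each pass operates on the previous pass's output, the final image is the result of iterating the transfer over all corresponding regions in selection order, as the text describes.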
The weighted average fusion method comprises: performing local color transfer on each pair of corresponding regions independently, and then taking a weighted average of the results obtained for each pair of corresponding regions using their respective CTWCs, yielding the final result image. The basic steps are as follows:
1) the following formula is adopted to separately perform local color transfer on each pair of corresponding regions:
$C_s^k(i, j) = C_s(i, j) + w(x_{ij}^k) \cdot \left( \mu_t^k + \frac{\sigma_t^k}{\sigma_s^k} (C_s(i, j) - \mu_s^k) - C_s(i, j) \right), \quad k = 1, 2, \ldots, M$

where M > 1; $C_s^k(i, j)$ is the value of pixel point (i, j) in the original image after color transfer of one color channel for the k-th corresponding region; $C_s(i, j)$ is the original value of that channel at (i, j); $w(x_{ij}^k)$ is the CTWC of the pixel point (i, j) with respect to the k-th corresponding region; $\mu_s^k$ and $\mu_t^k$ are the means of the corresponding color channel over the k-th corresponding regions in the original image and the target image, respectively; and $\sigma_s^k$ and $\sigma_t^k$ are the corresponding standard deviations.
In this step, for each pixel point (i, j) in the original image, the color statistical distance $x_{ij}^k$ to each corresponding region is computed as described above, and then the CTWC of the pixel with respect to that region, $w(x_{ij}^k)$ (k = 1, 2, ..., M), is obtained;
2) For each pair of corresponding regions, the normalized CTWC of the pixel point (i, j) is computed as

$p(x_{ij}^k) = \frac{w(x_{ij}^k)}{\sum_{k=1}^{M} w(x_{ij}^k)}$

so that the weights over all M regions sum to one for each pixel;
3) The result of the pixel point (i, j) after color transfer over all pairs of corresponding regions is then

$C_s^{new}(i, j) = \sum_{k=1}^{M} p(x_{ij}^k) \cdot C_s^k(i, j)$
And processing each pixel point according to the method to obtain a final color transfer result image.
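Steps 1) to 3) of the weighted average fusion can be sketched together as follows (NumPy; an illustrative sketch with assumed names, not the patent's code):

```python
import numpy as np

def weighted_average_fusion(C_s, weights, stats):
    """Weighted-average multi-region transfer for one channel.

    Each region pair is transferred independently from the ORIGINAL channel,
    then the M results are blended with normalized CTWCs
    p(x_ij^k) = w_k / sum_k w_k.

    C_s     : (H, W) original channel values
    weights : list of (H, W) CTWC maps, one per region pair
    stats   : list of (mu_s^k, sigma_s^k, mu_t^k, sigma_t^k) tuples
    """
    C_s = C_s.astype(float)
    results, W = [], np.zeros_like(C_s)
    for w, (mu_s, sigma_s, mu_t, sigma_t) in zip(weights, stats):
        # Step 1): independent transfer for region k, always from C_s.
        mapped = mu_t + (sigma_t / sigma_s) * (C_s - mu_s)
        results.append(C_s + w * (mapped - C_s))
        W += w
    # Step 2): normalized per-pixel weights (assumes W > 0 everywhere).
    p = [w / W for w in weights]
    # Step 3): weighted average of the per-region results.
    return sum(pk * Ck for pk, Ck in zip(p, results))
```

Unlike the iterative method, the order of the region pairs does not matter here, since every per-region transfer starts from the unmodified original channel.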
According to the method above, a CTWC is constructed for each pixel point of the original image from the corresponding color transfer regions selected in the original image and the target image, determining the degree to which each pixel of the original image is influenced by the colors of the target image, thereby achieving local color transfer. In addition, multi-region local color transfer is realized by fusing the per-region results with either the iterative image fusion method or the weighted average fusion method.
Corresponding to the above method, an embodiment of the present invention further provides a device for color transfer, referring to the structural block diagram of the device for color transfer shown in fig. 3, where the device includes the following modules:
a region selection module 32, configured to select one or more pairs of corresponding regions in the original image and the target image according to the color delivery task;
a color transfer weight coefficient setting module 34, configured to set a color transfer weight coefficient CTWC for each pixel in the original image for each pair of corresponding regions selected by the region selection module 32 based on the L α β color space, where the CTWC of a pixel is inversely proportional to a color statistical distance from the pixel to a currently selected region of the original image;
and the color linear transformation module 36 is configured to perform linear transformation of color transmission on the original image according to the CTWC of each pair of pixels corresponding to each corresponding region set by the color transmission weight coefficient setting module 34, so as to obtain a transformed image.
The device of this embodiment adopts the Lαβ color space, whose color channels are largely independent, sets the CTWC of each pixel point to be inversely proportional to the color statistical distance from that pixel point to the currently selected region of the original image, and then performs color transfer based on the CTWC of each pixel point, achieving a good local color transfer effect and adapting to the diversified demands of users.
Preferably, the color transfer weight coefficient setting module 34 includes: a color statistical distance calculating unit for calculating the color statistical distance $x_{ij}^k$ between the k-th selected region and each pixel point (i, j) of the original image outside that region, where the means of the color channels of the k-th selected region in the Lαβ color space are $(\mu_l^k, \mu_\alpha^k, \mu_\beta^k)$ and the color channel values of the pixel point (i, j) in the Lαβ color space are $(l_{ij}^k, \alpha_{ij}^k, \beta_{ij}^k)$; a first setting unit configured to set the CTWC of the pixel point (i, j) to

$w(x_{ij}^k) = \frac{1}{1 + (x_{ij}^k / x_c)^{2N}}$

where $x_c$ is the set color cutoff statistical distance and N is the set filter order; and a second setting unit for setting the CTWC of the pixel points inside the k-th selected region to 1.
For the case where a single pair of corresponding regions is selected in the original image and the target image, the color linear transformation module 36 may include a first linear transformation unit for performing the linear transformation of color transfer on one color channel of the original image using the formula

$C_s^{new}(i, j) = C_s(i, j) + w(x_{ij}) \cdot \left( \mu_t + \frac{\sigma_t}{\sigma_s} (C_s(i, j) - \mu_s) - C_s(i, j) \right)$

where $C_s^{new}(i, j)$ is the transformed value of the corresponding color channel at pixel point (i, j) in the original image; $C_s(i, j)$ is the original value of that channel at (i, j); $w(x_{ij})$ is the CTWC of the pixel point (i, j); $\mu_s$ and $\mu_t$ are the means of the corresponding color channel over the corresponding regions of the original image and the target image, respectively; and $\sigma_s$ and $\sigma_t$ are the corresponding standard deviations.
For the case where several pairs of corresponding regions are selected in the original image and the target image, the color linear transformation module 36 may be implemented in two specific ways. In the first way, the color linear transformation module 36 includes an iterative fusion unit for, when M pairs of corresponding regions are selected in the original image and the target image, performing local color transfer on each pair of corresponding regions in turn, in the order in which the regions were selected, using the formula

$C_s^k(i, j) = C_s^{k-1}(i, j) + w(x_{ij}^k) \cdot \left( \mu_t^k + \frac{\sigma_t^k}{\sigma_s^k} (C_s^{k-1}(i, j) - \mu_s^k) - C_s^{k-1}(i, j) \right), \quad k = 1, 2, \ldots, M$

where M > 1; $C_s^0(i, j)$ is the initial value of a color channel at pixel point (i, j) in the original image; $C_s^k(i, j)$ is the result after the color of the channel has been transferred for the k-th corresponding region; $w(x_{ij}^k)$ is the CTWC of the pixel point (i, j) with respect to the k-th corresponding region; $\mu_s^k$ and $\mu_t^k$ are the means of the corresponding color channel over the k-th corresponding regions in the original image and the target image, respectively; and $\sigma_s^k$ and $\sigma_t^k$ are the corresponding standard deviations;
In the second way, the color linear transformation module 36 includes the following units:
a local color transfer unit for, when M pairs of corresponding regions are selected in the original image and the target image, performing local color transfer on each pair of corresponding regions independently using the formula

$C_s^k(i, j) = C_s(i, j) + w(x_{ij}^k) \cdot \left( \mu_t^k + \frac{\sigma_t^k}{\sigma_s^k} (C_s(i, j) - \mu_s^k) - C_s(i, j) \right), \quad k = 1, 2, \ldots, M$

where M > 1; $C_s^k(i, j)$ is the value of pixel point (i, j) in the original image after color transfer of one color channel for the k-th corresponding region; $C_s(i, j)$ is the original value of that channel at (i, j); $w(x_{ij}^k)$ is the CTWC of the pixel point (i, j) with respect to the k-th corresponding region; $\mu_s^k$ and $\mu_t^k$ are the means of the corresponding color channel over the k-th corresponding regions in the original image and the target image, respectively; and $\sigma_s^k$ and $\sigma_t^k$ are the corresponding standard deviations;
a weighted average calculation unit for computing, for each pair of corresponding regions, the normalized CTWC of the pixel point (i, j)

$p(x_{ij}^k) = \frac{w(x_{ij}^k)}{\sum_{k=1}^{M} w(x_{ij}^k)}$

and a weighted average fusion unit for computing the result of the pixel point (i, j) after color transfer over all pairs of corresponding regions

$C_s^{new}(i, j) = \sum_{k=1}^{M} p(x_{ij}^k) \cdot C_s^k(i, j)$
This embodiment realizes local color transfer on the image on the basis of the color transfer weight coefficients, giving precise control over the transfer region and high flexibility. In addition, with the iterative image fusion method or the weighted average fusion method, local color transfer over multiple regions is achieved well.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and alternatively, they may be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, and in some cases, the steps shown or described may be performed in an order different than that described herein, or they may be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method of color delivery, the method being based on L α β color space, comprising:
selecting one or more pairs of corresponding regions in the original image and the target image according to the color transmission task;
setting a Color Transfer Weight Coefficient (CTWC) for each pixel point in the original image aiming at each pair of corresponding areas, wherein the CTWC of the pixel point is in inverse proportion to the color statistical distance from the pixel point to the currently selected area of the original image;
and carrying out linear transformation of color transmission on the original image according to the CTWC of the pixel points corresponding to each pair of corresponding areas to obtain a transformed image.
2. The method of claim 1, wherein selecting one or more pairs of corresponding regions in the original and target maps according to the color delivery task comprises:
selecting corresponding areas in the original image and the target image one by one according to the color transmission task;
and setting the same mark for the corresponding areas selected from the original image and the target image.
3. The method of claim 1, wherein setting a CTWC for each pixel point in the original image for each pair of the corresponding regions comprises:
calculating the color statistical distance between the k-th selected region and each pixel point (i, j) of the original image outside that region as

$x_{ij}^k = \sqrt{(l_{ij}^k - \mu_l^k)^2 + (\alpha_{ij}^k - \mu_\alpha^k)^2 + (\beta_{ij}^k - \mu_\beta^k)^2}$

wherein the means of the color channels of the k-th selected region are $(\mu_l^k, \mu_\alpha^k, \mu_\beta^k)$ and the color channel values of the pixel point (i, j) are $(l_{ij}^k, \alpha_{ij}^k, \beta_{ij}^k)$;
setting the CTWC of the pixel point (i, j) to

$w(x_{ij}^k) = \frac{1}{1 + (x_{ij}^k / x_c)^{2N}}$

wherein $x_c$ is the set color cutoff statistical distance and N is the set filter order;
and setting the CTWC of the pixel points in the k-th selected region to 1.
4. The method of claim 1, wherein performing a linear transformation of the color delivery on the original image according to the CTWC of the pixel points corresponding to each pair of the corresponding regions comprises:
when the corresponding regions selected in the original image and the target image are a single pair, performing the linear transformation of color transfer on one color channel of the original image using the formula

$C_s^{new}(i, j) = C_s(i, j) + w(x_{ij}) \cdot \left( \mu_t + \frac{\sigma_t}{\sigma_s} (C_s(i, j) - \mu_s) - C_s(i, j) \right)$

wherein $C_s^{new}(i, j)$ is the transformed value of the corresponding color channel at the pixel point (i, j) in the original image; $C_s(i, j)$ is the value of that channel at the pixel point (i, j); $w(x_{ij})$ is the CTWC of the pixel point (i, j); $\mu_s$ and $\mu_t$ are respectively the means of the color channel over the corresponding regions of the original image and the target image; and $\sigma_s$ and $\sigma_t$ are respectively the standard deviations of the color channel over those regions.
5. The method of claim 1, wherein performing the color-transfer linear transformation on the original image according to the CTWC of the pixel points for each pair of the corresponding regions comprises:

when the corresponding areas selected in the original image and the target image are M pairs, sequentially performing local color transfer on each pair of corresponding regions, in the order in which the regions were selected, by adopting the formula:

$C_s^k(i,j) = C_s^{k-1}(i,j) + w(x_{ij}^k) \cdot \left( \mu_t^k + \frac{\sigma_t^k}{\sigma_s^k}\left(C_s^{k-1}(i,j) - \mu_s^k\right) - C_s^{k-1}(i,j) \right) \quad (k = 1, 2, 3, \ldots, M)$

wherein M > 1; $C_s^0(i,j)$ is the initial value of a color channel of the pixel point (i, j) in the original image; $C_s^k(i,j)$ is the result obtained after the color of the kth corresponding region is transferred for that color channel; $w(x_{ij}^k)$ is the CTWC of the pixel point (i, j) for the kth corresponding region; $\mu_s^k$ and $\mu_t^k$ are, respectively, the means of the color channel over the kth corresponding region in the original image and the target image; and $\sigma_s^k$ and $\sigma_t^k$ are the standard deviations of the color channel over the kth corresponding region in the original image and the target image.
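A sketch of the sequential variant of claim 5, in which each region pair transforms the output of the previous one (C_s^0 is the original channel); parameter names are illustrative:

```python
import numpy as np

def sequential_transfer(src, regions):
    """Order-dependent multi-region transfer (sketch of claim 5).

    src     -- H x W array: one color channel of the original image (C_s^0)
    regions -- list of (w, mu_s, sigma_s, mu_t, sigma_t) tuples, one per
               region pair, in the order the regions were selected
    """
    out = np.asarray(src, dtype=float)       # C_s^0(i, j)
    for w, mu_s, sigma_s, mu_t, sigma_t in regions:
        full = mu_t + (sigma_t / sigma_s) * (out - mu_s)
        out = out + w * (full - out)         # C_s^k from C_s^{k-1}
    return out
```

Because each step rewrites the running result, the outcome depends on the order in which the region pairs were selected.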
6. The method of claim 1, wherein performing the color-transfer linear transformation on the original image according to the CTWC of the pixel points for each pair of the corresponding regions comprises:

when the corresponding areas selected in the original image and the target image are M pairs, independently performing local color transfer on each pair of corresponding regions by adopting the formula:

$C_s^k(i,j) = C_s(i,j) + w(x_{ij}^k) \cdot \left( \mu_t^k + \frac{\sigma_t^k}{\sigma_s^k}\left(C_s(i,j) - \mu_s^k\right) - C_s(i,j) \right) \quad (k = 1, 2, \ldots, M)$

wherein M > 1; $C_s^k(i,j)$ is the value of the pixel point (i, j) in the original image after the color transfer of one color channel for the kth corresponding region; $C_s(i,j)$ is the value of the pixel point (i, j) for the color channel; $w(x_{ij}^k)$ is the CTWC of the pixel point (i, j) for the kth corresponding region; $\mu_s^k$ and $\mu_t^k$ are, respectively, the means of the color channel over the kth corresponding region in the original image and the target image; and $\sigma_s^k$ and $\sigma_t^k$ are the corresponding standard deviations;

for each pair of the corresponding regions, calculating the normalized CTWC weight $p(x_{ij}^k)$ of the pixel point (i, j):

$p(x_{ij}^k) = \dfrac{w(x_{ij}^k)}{\sum_{k=1}^{M} w(x_{ij}^k)}$

and calculating the fused result of the color transfers of the pixel point (i, j) over all the pairs of corresponding regions:

$C_s^{new}(i,j) = \sum_{k=1}^{M} p(x_{ij}^k) \cdot C_s^k(i,j)$.
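A sketch of the independent variant of claim 6: every region pair transforms the original channel, and the M results are fused with normalized CTWC weights. The small `eps` guarding an all-zero weight sum is an implementation assumption, not part of the claim:

```python
import numpy as np

def fused_transfer(src, regions, eps=1e-12):
    """Order-independent multi-region transfer (sketch of claim 6).

    src     -- H x W array: one color channel of the original image
    regions -- list of (w, mu_s, sigma_s, mu_t, sigma_t) tuples per pair
    eps     -- assumed guard against a zero weight sum (not claimed)
    """
    src = np.asarray(src, dtype=float)
    results, weights = [], []
    for w, mu_s, sigma_s, mu_t, sigma_t in regions:
        full = mu_t + (sigma_t / sigma_s) * (src - mu_s)
        results.append(src + w * (full - src))     # C_s^k(i, j)
        weights.append(np.asarray(w, dtype=float))
    w_stack = np.stack(weights)                    # M x H x W
    p = w_stack / (w_stack.sum(axis=0) + eps)      # p(x_ij^k)
    return (p * np.stack(results)).sum(axis=0)     # C_s^new(i, j)
```

Unlike the sequential variant of claim 5, this fusion does not depend on the order of the region pairs.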
7. An apparatus for color delivery, comprising:
the region selection module is used for selecting one or more pairs of corresponding regions in the original image and the target image according to the color transmission task;
a color transfer weight coefficient setting module, configured to set a color transfer weight coefficient CTWC for each pixel point in the original image for each pair of corresponding regions selected by the region selection module based on an L α β color space, where the CTWC of the pixel point is inversely proportional to a color statistical distance from the pixel point to a currently selected region of the original image;
and the color linear transformation module is used for performing the color-transfer linear transformation on the original image according to the CTWC, set by the color transfer weight coefficient setting module, of the pixel points for each pair of corresponding regions, so as to obtain a transformed image.
8. The apparatus of claim 7, wherein the color transfer weight coefficient setting module comprises:
a color statistical distance calculating unit, configured to calculate the color statistical distance $s_{ij}^k = (l_{ij}^k - \mu_l^k)^2 + (\alpha_{ij}^k - \mu_\alpha^k)^2 + (\beta_{ij}^k - \mu_\beta^k)^2$ between each pixel point (i, j) outside the kth selection region and the kth selection region in the original image; wherein $\mu_l^k$, $\mu_\alpha^k$, $\mu_\beta^k$ are the mean values of the color channels of the kth selection region in the Lαβ color space, and $l_{ij}^k$, $\alpha_{ij}^k$, $\beta_{ij}^k$ are the values of the color channels of the pixel point (i, j) in the Lαβ color space;

a first setting unit, configured to set the CTWC of the pixel point (i, j) to $w(x_{ij})$ [the defining formula appears only as an image in the original], wherein $x_c$ is the set color cutoff statistical distance and N is the set color filtering order;

and a second setting unit, configured to set the CTWC of the pixel points in the kth selection region to 1.
9. The apparatus of claim 7, wherein the color linear transformation module comprises:
a first linear transformation unit, configured to carry out the color-transfer linear transformation on one color channel of the original image, when the corresponding areas selected in the original image and the target image are a single pair, by adopting the formula:

$C_s^{new}(i,j) = C_s(i,j) + w(x_{ij}) \cdot \left( \mu_t + \frac{\sigma_t}{\sigma_s}\left(C_s(i,j) - \mu_s\right) - C_s(i,j) \right)$

wherein $C_s^{new}(i,j)$ is the transformed value of the pixel point (i, j) in the original image for the color channel; $C_s(i,j)$ is the value of the pixel point (i, j) for the color channel; $w(x_{ij})$ is the CTWC of the pixel point (i, j); $\mu_s$ and $\mu_t$ are the means of the color channel over the corresponding regions of the original image and the target image, respectively; and $\sigma_s$ and $\sigma_t$ are the corresponding standard deviations.
10. The apparatus of claim 7,
the color linear transformation module includes: an iterative fusion unit, configured to, when the corresponding regions selected in the original image and the target image are M pairs, sequentially perform local color transfer on each pair of corresponding regions, in the order in which the regions were selected, by adopting the formula:

$C_s^k(i,j) = C_s^{k-1}(i,j) + w(x_{ij}^k) \cdot \left( \mu_t^k + \frac{\sigma_t^k}{\sigma_s^k}\left(C_s^{k-1}(i,j) - \mu_s^k\right) - C_s^{k-1}(i,j) \right) \quad (k = 1, 2, 3, \ldots, M)$

wherein M > 1; $C_s^0(i,j)$ is the initial value of a color channel of the pixel point (i, j) in the original image; $C_s^k(i,j)$ is the result obtained after the color of the kth corresponding region is transferred for that color channel; $w(x_{ij}^k)$ is the CTWC of the pixel point (i, j) for the kth corresponding region; $\mu_s^k$ and $\mu_t^k$ are, respectively, the means of the color channel over the kth corresponding region in the original image and the target image; and $\sigma_s^k$ and $\sigma_t^k$ are the standard deviations of the color channel over the kth corresponding region in the original image and the target image;
or, the color linear transformation module includes:
a local color transfer unit, configured to, when the corresponding regions selected in the original image and the target image are M pairs, independently perform local color transfer on each pair of corresponding regions by adopting the formula:

$C_s^k(i,j) = C_s(i,j) + w(x_{ij}^k) \cdot \left( \mu_t^k + \frac{\sigma_t^k}{\sigma_s^k}\left(C_s(i,j) - \mu_s^k\right) - C_s(i,j) \right) \quad (k = 1, 2, \ldots, M)$

wherein M > 1; $C_s^k(i,j)$ is the value of the pixel point (i, j) in the original image after the color transfer of one color channel for the kth corresponding region; $C_s(i,j)$ is the value of the pixel point (i, j) for the color channel; $w(x_{ij}^k)$ is the CTWC of the pixel point (i, j) for the kth corresponding region; $\mu_s^k$ and $\mu_t^k$ are, respectively, the means of the color channel over the kth corresponding region in the original image and the target image; and $\sigma_s^k$ and $\sigma_t^k$ are the corresponding standard deviations;

a weighted average calculation unit, configured to calculate the normalized CTWC weight $p(x_{ij}^k)$ of the pixel point (i, j) for each pair of the corresponding regions:

$p(x_{ij}^k) = \dfrac{w(x_{ij}^k)}{\sum_{k=1}^{M} w(x_{ij}^k)}$

and a weighted average fusion unit, configured to calculate the fused result of the color transfers of the pixel point (i, j) over all the pairs of corresponding regions:

$C_s^{new}(i,j) = \sum_{k=1}^{M} p(x_{ij}^k) \cdot C_s^k(i,j)$.
CN2013104413422A 2013-09-25 2013-09-25 Method and device for transmitting colors Pending CN103473747A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2013104413422A CN103473747A (en) 2013-09-25 2013-09-25 Method and device for transmitting colors

Publications (1)

Publication Number Publication Date
CN103473747A true CN103473747A (en) 2013-12-25

Family

ID=49798581

Country Status (1)

Country Link
CN (1) CN103473747A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108777783A (en) * 2018-07-09 2018-11-09 广东交通职业技术学院 A kind of image processing method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6058207A (en) * 1995-05-03 2000-05-02 Agfa Corporation Selective color correction applied to plurality of local color gamuts
CN1947147A (en) * 2004-04-28 2007-04-11 三洋电机株式会社 Portable telephone, image converter, control method and program

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
YU-WING TAI et al.: "Local color transfer via probabilistic segmentation by expectation-maximization", Computer Vision and Pattern Recognition, 2005 (CVPR 2005), IEEE Computer Society Conference on *
HE Yongqiang et al.: "Colorization of gray-scale images based on fusion and color transfer", Infrared Technology *
PU Yuanyuan: "Research on digital simulation and synthesis techniques of the artistic style of Yunnan heavy-color painting", China Doctoral Dissertations Full-text Database *

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20131225