Image Blending and Compositing
© NASA
15-463: Computational Photography
Alexei Efros, CMU, Fall 2011
Image Compositing
Compositing Procedure
1. Extract Sprites (e.g., using Intelligent Scissors in Photoshop)
2. Blend them into the composite (in the right order)
Composite by David Dewey
Need blending
Alpha Blending / Feathering
[Figure: two images combined with complementary alpha ramps, one falling from 1 to 0 and the other rising from 0 to 1]
Iblend = α·Ileft + (1 − α)·Iright
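The feathering above is just a per-pixel convex combination with a ramped alpha. Below is a minimal Python sketch under simple assumptions: Ileft and Iright are float images of the same size, the seam is a vertical line at seam_col, and the ramp is window pixels wide; the function name and parameters are illustrative, not from the slides.

import numpy as np

def feather_blend(Ileft, Iright, seam_col, window):
    # alpha falls linearly from 1 (pure left) to 0 (pure right) across the window
    w = Ileft.shape[1]
    x = np.arange(w)
    alpha = np.clip(0.5 - (x - seam_col) / float(window), 0.0, 1.0)
    # broadcast the 1D ramp over rows (and color channels, if any)
    alpha = alpha[None, :, None] if Ileft.ndim == 3 else alpha[None, :]
    return alpha * Ileft + (1.0 - alpha) * Iright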
Effect of Window Size
[Figures: feathering the left and right images with a narrow vs. a wide alpha window]
Good Window Size
“Optimal” Window: smooth but not ghosted
What is the Optimal Window?
To avoid seams
• window = size of largest prominent feature
To avoid ghosting
• window <= 2*size of smallest prominent feature
Natural to cast this in the Fourier domain
• largest frequency <= 2*size of smallest frequency
• image frequency content should occupy one “octave” (power of two)
[Figure: FFT]
What if the Frequency Spread is Wide?
[Figure: FFT]
Idea (Burt and Adelson)
• Compute Fleft = FFT(Ileft), Fright = FFT(Iright)
• Decompose Fourier image into octaves (bands)
– Fleft = Fleft_1 + Fleft_2 + …
• Feather corresponding octaves Fleft_i with Fright_i
– Can compute inverse FFT and feather in spatial domain
• Sum feathered octave images in frequency domain
Better implemented in spatial domain
Octaves in the Spatial Domain
Lowpass Images
Bandpass Images
Pyramid Blending
[Figure: left pyramid, blend mask ramping from 0 to 1, right pyramid]
Pyramid Blending
[Figure: Laplacian levels 4, 2, and 0 of the left, right, and blended pyramids]
Laplacian Pyramid: Blending
General Approach:
1. Build Laplacian pyramids LA and LB from images A and B
2. Build a Gaussian pyramid GR from selected region R
3. Form a combined pyramid LS from LA and LB, using the nodes
of GR as weights:
• LS(i,j) = GR(i,j)*LA(i,j) + (1-GR(i,j))*LB(i,j)
4. Collapse the LS pyramid to get the final blended image
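The four steps above translate almost line for line into Python. A minimal sketch using OpenCV's pyrDown/pyrUp, assuming single-channel float images A and B and a mask R in [0, 1], all with dimensions that are multiples of 2**levels; the function names and the default level count are illustrative.

import numpy as np
import cv2

def gaussian_pyramid(img, levels):
    pyr = [img]
    for _ in range(levels):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels):
        h, w = gp[i].shape[:2]
        up = cv2.pyrUp(gp[i + 1], dstsize=(w, h))
        lp.append(gp[i] - up)      # band-pass residual at level i
    lp.append(gp[levels])          # low-pass residue at the coarsest level
    return lp

def pyramid_blend(A, B, R, levels=5):
    LA, LB = laplacian_pyramid(A, levels), laplacian_pyramid(B, levels)
    GR = gaussian_pyramid(R, levels)
    # step 3: LS(i,j) = GR(i,j)*LA(i,j) + (1-GR(i,j))*LB(i,j), level by level
    LS = [gr * la + (1.0 - gr) * lb for gr, la, lb in zip(GR, LA, LB)]
    # step 4: collapse by upsampling and adding, coarse to fine
    out = LS[-1]
    for ls in reversed(LS[:-1]):
        h, w = ls.shape[:2]
        out = cv2.pyrUp(out, dstsize=(w, h)) + ls
    return out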
Blending Regions
Horror Photo
© david dmartin (Boston College)
Results from this class (fall 2005)
© Chris Cameron
Season Blending (St. Petersburg)
Simplification: Two-band Blending
Brown & Lowe, 2003
• Only use two bands: high freq. and low freq.
• Blend low freq. smoothly
• Blend high freq. with no smoothing: use a binary alpha
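A minimal Python sketch of the two-band idea under simple assumptions: grayscale float images A and B, a binary mask alpha_binary that selects A where it is 1, and a Gaussian blur standing in for the low-/high-frequency split (the exact filter and sigma are illustrative choices, not the ones used by Brown & Lowe).

import numpy as np
from scipy.ndimage import gaussian_filter

def two_band_blend(A, B, alpha_binary, sigma=2.0):
    # split each image into a low-frequency band and its high-frequency residual
    lowA, lowB = gaussian_filter(A, sigma), gaussian_filter(B, sigma)
    highA, highB = A - lowA, B - lowB
    # low band: blend with a smooth (feathered) alpha
    alpha_smooth = gaussian_filter(alpha_binary.astype(float), sigma)
    low = alpha_smooth * lowA + (1.0 - alpha_smooth) * lowB
    # high band: blend with the binary alpha (no smoothing)
    high = np.where(alpha_binary > 0.5, highA, highB)
    return low + high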
2-band Blending
[Figures: low-frequency band (λ > 2 pixels), high-frequency band (λ < 2 pixels); linear blending vs. 2-band blending]
Don’t blend, CUT!
Moving objects become ghosts
So far we only tried to blend between two images.
What about finding an optimal seam?
Davis, 1998
Segment the mosaic
• Single source image per segment
• Avoid artifacts along boundaries
– Dijkstra’s algorithm
Minimal error boundary
[Figure: overlapping blocks with a vertical boundary; the squared overlap error and the minimum-error boundary through it]
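A minimal dynamic-programming sketch of the minimum-error vertical boundary: given the two overlapping grayscale strips ov1 and ov2, accumulate the squared overlap error down the rows and backtrack the cheapest path, which may shift by at most one column per row; the function name is illustrative.

import numpy as np

def min_error_boundary(ov1, ov2):
    err = (ov1 - ov2) ** 2                 # squared overlap error
    H, W = err.shape
    cost = err.astype(float).copy()
    # accumulate the cheapest path cost from the top row down
    for i in range(1, H):
        for j in range(W):
            lo, hi = max(0, j - 1), min(W, j + 2)
            cost[i, j] += cost[i - 1, lo:hi].min()
    # backtrack the minimum-cost column in each row
    seam = np.zeros(H, dtype=int)
    seam[-1] = int(np.argmin(cost[-1]))
    for i in range(H - 2, -1, -1):
        j = seam[i + 1]
        lo, hi = max(0, j - 1), min(W, j + 2)
        seam[i] = lo + int(np.argmin(cost[i, lo:hi]))
    return seam                            # seam[i]: cut column in row i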
Seam Carving
http://www.youtube.com/watch?v=6NcIJXTlugc
Graphcuts
What if we want a similar “cut-where-things-agree” idea, but for closed regions?
• Dynamic programming can’t handle loops
Graph cuts – a more general solution
[Figure: graph with terminals s and t, hard constraints, n-links between pixels, and a cut]
Minimum cost cut can be computed in polynomial time
(max-flow/min-cut algorithms)
Kwatra et al., 2003
Actually, for this example, DP will work just as well…
Lazy Snapping
Interactive segmentation using graph cuts
Gradient Domain
In Pyramid Blending, we decomposed our image into 2nd derivatives (Laplacian) and a low-res image
Let us now look at 1st derivatives (gradients):
• No need for low-res image
– captures everything (up to a constant)
• Idea:
– Differentiate
– Blend / edit / whatever
– Reintegrate
Gradient Domain blending (1D)
[Figure: two 1D signals, one bright and one dark; regular blending vs. blending the derivatives]
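In 1D the whole pipeline is a few lines. A minimal sketch, assuming left and right are 1D signals of equal length and alpha is a feathering ramp of the same length; pinning the integration constant to the left signal's first sample is an illustrative choice.

import numpy as np

def blend_gradients_1d(left, right, alpha):
    # differentiate both signals, then feather the derivatives
    dl, dr = np.diff(left), np.diff(right)
    d = alpha[1:] * dl + (1.0 - alpha[1:]) * dr
    # reintegrate with a cumulative sum; fix the constant of integration
    out = np.empty(len(left), dtype=float)
    out[0] = left[0]
    out[1:] = left[0] + np.cumsum(d)
    return out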
Gradient Domain Blending (2D)
Trickier in 2D:
• Take partial derivatives dx and dy (the gradient field)
• Fiddle around with them (smooth, blend, feather, etc.)
• Reintegrate
– But now integral(dx) might not equal integral(dy)
• Find the most agreeable solution
– Equivalent to solving Poisson equation
– Can be done using least-squares
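The “most agreeable” reintegration can be posed as a sparse least-squares problem. A minimal sketch, assuming simple forward differences and scipy's LSQR, with one pixel pinned to fix the unknown constant of integration; gx and gy are H×W target gradient fields, and the discretization and solver are illustrative choices rather than any specific paper's method.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

def reintegrate(gx, gy, pin_value=0.0):
    H, W = gx.shape
    # forward-difference operators for the row-major flattened image
    d_w = sp.diags([-np.ones(W - 1), np.ones(W - 1)], [0, 1], shape=(W - 1, W))
    d_h = sp.diags([-np.ones(H - 1), np.ones(H - 1)], [0, 1], shape=(H - 1, H))
    Dx = sp.kron(sp.eye(H), d_w)   # horizontal differences
    Dy = sp.kron(d_h, sp.eye(W))   # vertical differences
    # pin one pixel so the constant of integration is determined
    pin = sp.csr_matrix(([1.0], ([0], [0])), shape=(1, H * W))
    A = sp.vstack([Dx, Dy, pin]).tocsr()
    b = np.concatenate([gx[:, :-1].ravel(), gy[:-1, :].ravel(), [pin_value]])
    f = lsqr(A, b)[0]              # least-squares solution
    return f.reshape(H, W)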
Perez et al., 2003
[Figures: gradient-domain editing examples]
Limitations:
• Can’t do contrast reversal (gray on black -> gray on white)
• Colored backgrounds “bleed through”
• Images need to be very well aligned
Gradients vs. Pixels
Can we use this for range compression?
[Figure: image regions each labeled “White?”]
Thinking in Gradient Domain
Our very own Jim McCann:
James McCann, Real-Time Gradient-Domain Painting, SIGGRAPH 2009
Gradient Domain as Image Representation
See the GradientShop paper as a good example:
http://www.gradientshop.com/
Can be used to exert high-level control over images
• gradients – low-level image features
• gradients – give rise to high-level image features
• manipulate local gradients to manipulate global image interpretation
[Figure: columns of pixel gradients of +100, +255, and +0, and the image edges they do or do not produce]
Can be used to exert high-level control over images
gradients give rise to high-level image features:
• Edges – object boundaries, depth discontinuities, shadows, …
• Texture – visual richness, surface properties
• Shading – lighting, shape (sculpting the face using shading/makeup)
• Artifacts – sensor noise, seams in composite images, blocking and ringing in compressed images
Optimization framework
Pravin Bhat et al.
• Input unfiltered image – u
• Output filtered image – f
• Specify desired pixel-differences – (gx, gy)
• Specify desired pixel-values – d
• Specify constraint weights – (wx, wy, wd)
Energy function:
min_f  wx(fx – gx)² + wy(fy – gy)² + wd(f – d)²
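A minimal sketch of this energy as a weighted sparse least-squares problem in Python. The names gx, gy, d, wx, wy, wd follow the slides (all H×W arrays); the forward-difference discretization and the LSQR solver are implementation assumptions, not necessarily what GradientShop itself uses.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

def solve_gradient_energy(gx, gy, d, wx, wy, wd):
    H, W = d.shape
    # forward-difference operators for the row-major flattened image f
    d_w = sp.diags([-np.ones(W - 1), np.ones(W - 1)], [0, 1], shape=(W - 1, W))
    d_h = sp.diags([-np.ones(H - 1), np.ones(H - 1)], [0, 1], shape=(H - 1, H))
    Dx = sp.kron(sp.eye(H), d_w)           # fx
    Dy = sp.kron(d_h, sp.eye(W))           # fy
    I = sp.eye(H * W)                      # f itself, for the data term
    # weight each residual row by sqrt(w) so the squared terms pick up w
    sx = sp.diags(np.sqrt(wx[:, :-1].ravel()))
    sy = sp.diags(np.sqrt(wy[:-1, :].ravel()))
    sd = sp.diags(np.sqrt(wd.ravel()))
    A = sp.vstack([sx @ Dx, sy @ Dy, sd @ I]).tocsr()
    b = np.concatenate([
        (np.sqrt(wx[:, :-1]) * gx[:, :-1]).ravel(),   # wx (fx - gx)^2 targets
        (np.sqrt(wy[:-1, :]) * gy[:-1, :]).ravel(),   # wy (fy - gy)^2 targets
        (np.sqrt(wd) * d).ravel(),                    # wd (f - d)^2 targets
    ])
    return lsqr(A, b)[0].reshape(H, W)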
Pseudo image relighting
• change scene illumination in post-production
[Figures: example input, manual relight, GradientShop relight; images u, o, and f]
Energy function:
min_f  wx(fx – gx)² + wy(fy – gy)² + wd(f – d)²
Definitions:
• d = u
• gx(p) = ux(p) * (1 + a(p))
• a(p) = max(0, −u(p).o(p))
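Taking the definitions above at face value, the modified gradient target is a few lines of Python; u is the input (unfiltered) image, o the second image shown in the slides, and ux the x pixel-differences of u. The use of np.gradient and the function name are illustrative assumptions; d, a, and gx follow the formulas on the slide.

import numpy as np

def relighting_targets(u, o):
    ux = np.gradient(u, axis=1)           # x pixel-differences of the input
    a = np.maximum(0.0, -u * o)           # a(p) = max(0, -u(p).o(p))
    gx = ux * (1.0 + a)                   # gx(p) = ux(p) * (1 + a(p))
    d = u                                 # d = u: stay close to the input values
    return gx, d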
Sparse data interpolation
• Interpolate scattered data over images/video
• Example app: Colorization* (*Levin et al. – SIGGRAPH 2004)
[Figures: input scribbles and colorized output; images u (with user data) and f]
Energy function:
min_f  wx(fx – gx)² + wy(fy – gy)² + wd(f – d)²
Definitions:
• d = user_data
• wd(p) = 1 if user_data(p) is defined, else wd(p) = 0
• gx(p) = 0; gy(p) = 0
• wx(p) = 1/(1 + c*|ux(p)|), wy(p) = 1/(1 + c*|uy(p)|)
  – or, with a local edge-detector response el: wx(p) = 1/(1 + c*|el(p)|), wy(p) = 1/(1 + c*|el(p)|)
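These definitions translate directly into the inputs of the energy above. A minimal sketch, assuming u is a grayscale input image, user_data and user_mask hold the sparse values and where they are defined, and solve_gradient_energy is the solver sketched earlier; the function name and the default value of c are illustrative.

import numpy as np

def sparse_interpolation_inputs(u, user_data, user_mask, c=10.0):
    H, W = u.shape
    d = np.where(user_mask, user_data, 0.0)    # d = user_data where defined
    wd = user_mask.astype(float)               # wd(p) = 1 if defined, else 0
    gx = np.zeros((H, W))                      # gx(p) = 0
    gy = np.zeros((H, W))                      # gy(p) = 0
    ux = np.gradient(u, axis=1)                # pixel-differences of the input
    uy = np.gradient(u, axis=0)
    wx = 1.0 / (1.0 + c * np.abs(ux))          # wx(p) = 1/(1 + c*|ux(p)|)
    wy = 1.0 / (1.0 + c * np.abs(uy))          # wy(p) = 1/(1 + c*|uy(p)|)
    return gx, gy, d, wx, wy, wd

# usage: f = solve_gradient_energy(*sparse_interpolation_inputs(u, values, mask))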