The error correcting illumination pattern is created by analyzing the depth maps obtained from the initial ensemble of patterns. Pixels where the depth measurements disagree across the initial patterns are identified as error pixels. A new pattern is then designed specifically for just those error pixels by minimizing the effects of global illumination for those pixels. This allows getting clean measurements for just the problematic pixels to resolve the errors.

Structured Light 3D Scanning

In the presence of Global Illumination

Mohit Gupta, Amit Agrawal, Srinivasa G. Narasimhan, Ashok Veeraraghavan

Presented by Noranart Vesdapunt, Utkarsh Sinha

Slides adapted from Srinivasa et al. and Nayar et al.


Structured Light

Two views, one projector


Different kinds of patterns
Binary
Grayscale
Colour

As fast as 4000 Hz
As precise as 30 µm
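To make these patterns concrete, here is a minimal sketch (not from the paper or the slides) that generates conventional binary Gray-code stripe patterns for projector column coding; the 1024-pixel width, 10-bit depth, and NumPy usage are assumptions.

import numpy as np

def gray_code_patterns(width=1024, num_bits=10):
    # Binary-reflected Gray code of every projector column.
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)
    # One row per bit plane, most significant (lowest frequency) first.
    shifts = np.arange(num_bits - 1, -1, -1)[:, None]
    return ((gray[None, :] >> shifts) & 1).astype(np.uint8)

patterns = gray_code_patterns()   # shape (10, 1024); tile each row vertically for a full frame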
3D Structured Light

image from http://www.3ders.org/


3D Structured Light (This Paper)

image from http://www.3ders.org/


Issues with Structured Light

Structured light is highly dependent on finding exact correspondences

This would fail if:
Illuminated pixels are occluded
Camera defocus blur
Unexpected illumination due to global illumination (GI)


Illuminated pixels are occluded
[Figure: camera and pattern projector viewing the scene from different positions]

Camera Defocus Blur
Issues with Structured Light

Structured light is highly dependent on finding exact correspondences

This would fail if:
Illuminated pixels are occluded
Defocus blur
Unexpected illumination due to GI


Issues due to GI
Light Transport
[Figure: light source illuminating the scene, showing direct illumination, inter-reflections, sub-surface scattering, and volumetric scattering]
Bowl on a Marble Slab
Patterns with Different Frequencies
Low Frequency: Inter-reflections (long range effects)
High Frequency: Subsurface Scattering (short range effects)
Combine Both
This Paper
Formulating the problem
V-Groove Scene

Inter-reflections
Conventional Gray codes
[Figure: low frequency pattern and its inverse projected on the V-groove; near the pattern edge the captured images have intensities L = 0.16 (pattern) and L̄ = 0.25 (inverse pattern)]
Binarization error (long-range effects)
Errors due to inter-reflections
[Figure: incorrect binarization vs. ground-truth binarization; one = illuminated, zero = not illuminated]


Point Light Source Illuminating the Scene

[Figure: source illuminating surface point i, observed by the camera]

L[i] = α L_d[i] + β L_g[i]

L_d: direct component, L_g: global component
Point Light Source Illuminating the Scene

Low frequency pattern: L[i] = α L_d[i] + β L_g[i]
Inverse pattern: L̄[i] = (1 − α) L_d[i] + (1 − β) L_g[i]
Point Light Source Illuminating the Scene

α is the projector defocus parameter; with no projector defocus blur, α = 1:

L[i] = L_d[i] + β L_g[i]
L̄[i] = (1 − β) L_g[i]

Correct binarization requires L[i] > L̄[i]
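As a minimal illustration of this decision rule (a sketch under assumptions, not the authors' code), suppose the captured pattern and inverse-pattern images are NumPy arrays; each bit is set where the pattern image is brighter than the inverse image, and the Gray-code bit planes are then converted to projector column indices.

import numpy as np

def binarize(captured, captured_inv):
    # Bit = 1 where L > L_bar, i.e. the pattern image is brighter
    # than the inverse-pattern image at that pixel.
    return (captured > captured_inv).astype(np.uint8)

def decode_gray(bit_planes):
    # Convert decoded Gray-code bit planes (MSB first, shape (num_bits, H, W))
    # into projector column indices via Gray-to-binary conversion.
    code = np.zeros(bit_planes.shape[1:], dtype=np.int32)
    for plane in bit_planes:
        code = (code << 1) | (plane ^ (code & 1))
    return code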
Incorrect decoding for low-frequencies

[Figure: captured images near the pattern edge, L = 0.16 and L̄ = 0.25]

L = Direct + β · Global        L̄ = (1 − β) · Global

β ≈ 0 and Direct < Global, so L < L̄ and the pixel is binarized incorrectly
Binarization for high-frequency pattern

[Figure: pattern and inverse pattern captures, L = 0.25 and L̄ = 0.16]

L = Direct + 0.5 · Global  >  L̄ = 0.5 · Global

For high-frequency patterns β ≈ 0.5, so the comparison comes out correct


Long Range Effects
High-frequency patterns are decoded correctly
[Figure: captured image and its binary decoding]

Short Range Effects
Fixing these problems
Key Ideas

Inter-reflections: Use high frequency patterns

Subsurface scattering: Use low frequency patterns

Design a system to deal with both simultaneously


Fixing Long Range Effects

Replace low frequency patterns with high frequency patterns
Fixing Long Range Effects
[Figure: binarization results; incorrect binarization vs. correct binarization]
Fixing Long Range Effects

Producing patterns:
Low frequency pattern XOR Last pattern = New pattern

While scanning (decoding):
New pattern XOR Last pattern = Original pattern
Fixing Long Range Effects

XOR'd with the last pattern:
Maximum stripe width of 2 pixels
Called XOR-02

XOR'd with the second-last pattern:
Maximum stripe width of 4 pixels
Called XOR-04
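A small sketch of the XOR-code construction (an illustration, not the authors' implementation), assuming the patterns are 0/1 NumPy arrays of shape (num_bits, width) with the lowest-frequency pattern first: every pattern is XOR'd with the last (XOR-02) or second-last (XOR-04) pattern before projection, and the same XOR is applied again after binarization to recover the original codes.

import numpy as np

def make_xor_codes(patterns, base_offset=1):
    # base_offset=1 -> XOR with the last pattern (XOR-02, max stripe width 2)
    # base_offset=2 -> XOR with the second-last pattern (XOR-04, max stripe width 4)
    base = patterns[-base_offset]
    xor_codes = patterns ^ base          # every projected pattern becomes high frequency
    xor_codes[-base_offset] = base       # the base pattern itself is projected unchanged
    return xor_codes

def undo_xor(decoded_bits, base_offset=1):
    # Recover the original Gray-code bit planes from binarized XOR-code captures.
    base = decoded_bits[-base_offset]
    original = decoded_bits ^ base
    original[-base_offset] = base
    return original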
Fixing Long Range Effects

Conventional Gray Codes (11 images) vs. XOR-04 Codes (11 images)
[Plot: Depth (mm), 600–1200, vs. pixel position, 0–800, comparing Gray Codes and XOR-04 Codes]
Fixing Long Range Effects
[Scene with diffusion + inter-reflections]

Fixing Long Range Effects
Regular Gray Codes (11 images) vs. XOR Codes (11 images)


Fixing Long Range Effects

Pro: No additional overhead
Pro: Near-perfect results on scenes with only inter-reflections
Con: Thin stripes can succumb to defocus blur
Con: Short range effects are magnified (subsurface scattering blurs the thin stripes)
Fixing Short Range Effects

Maximize the smallest stripe width

Can be posed as a mathematical question: binary Gray codes with long bit runs
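To make "smallest stripe width" measurable, here is a small helper (an illustration, not from the paper) that computes the minimum run of identical values across a set of 0/1 stripe patterns; conventional Gray codes have a minimum stripe width of 2 pixels, which the max min-SW codes are designed to increase.

import numpy as np

def min_stripe_width(patterns):
    # Smallest run of identical values (stripe width, in projector pixels)
    # over all rows of a (num_bits, width) 0/1 pattern array.
    smallest = patterns.shape[1]
    for row in patterns:
        changes = np.flatnonzero(np.diff(row)) + 1        # stripe boundaries
        edges = np.concatenate(([0], changes, [row.size]))
        smallest = min(smallest, np.diff(edges).min())    # shortest constant run
    return int(smallest)

demo = np.array([[0, 0, 0, 0, 1, 1, 1, 1],
                 [0, 0, 1, 1, 0, 0, 1, 1]])
print(min_stripe_width(demo))   # -> 2 (the second row has 2-pixel stripes)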
Fixing Short Range Effects

Pro: Immune to subsurface scattering


Pro: No additional overhead
Pro: Less prone to defocus blur
Con: Succumbs to inter-reflections
Can we do better?

[Subsurface scattering scene: Conventional Gray (10 images) vs. Max min-SW Gray (10 images)]
[Inter-reflection scene: XOR-04 (10 images) vs. XOR-02 (10 images)]


An ensemble of patterns

We don't know what the scene contains

Construct a system immune to these effects: fast and accurate

Project all four sets of light patterns
Calculate depth maps from all four
Fuse data from all depth maps
An ensemble of patterns

All four depth maps will probably not agree

The errors can be considered random: light bounces around depending on the scene

If two of the depths are similar, that is taken as the correct depth
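A minimal fusion sketch following this agreement rule (an illustration, not the authors' implementation), assuming four per-pixel depth maps as NumPy arrays and a hypothetical agreement tolerance tol in millimetres:

import numpy as np

def fuse_depths(depth_maps, tol=2.0):
    # depth_maps: (4, H, W) depths from the four pattern sets.
    # A pixel is an error pixel when no two estimates agree within tol.
    d = np.asarray(depth_maps, dtype=float)
    n, h, w = d.shape
    fused = np.full((h, w), np.nan)
    error = np.ones((h, w), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            agree = np.abs(d[i] - d[j]) < tol
            newly = agree & error                      # first agreeing pair wins
            fused[newly] = 0.5 * (d[i][newly] + d[j][newly])
            error[newly] = False
    return fused, error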


An ensemble of patterns

What if none of the depths match?

All four depth measurements disagree
Construct a pattern for just the error pixels
This reduces the error due to GI
Estimate depth for the error pixels using the new patterns
Loop until all pixels are resolved
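A rough control-flow sketch of this loop (hypothetical helper names capture_depths and error_mask_to_projector; reusing the fuse_depths sketch above): the error pixels define an illumination mask, the patterns are projected again only inside that mask so that global light from the rest of the scene is reduced, and the loop repeats until no error pixels remain or an iteration budget runs out.

def error_correcting_scan(capture_depths, error_mask_to_projector, max_iters=3, tol=2.0):
    # capture_depths(mask) -> (4, H, W) depths from the ensemble, projected
    #   only inside the given projector mask (None = full illumination).
    # error_mask_to_projector(err) -> projector-side mask covering the error pixels.
    fused, err = fuse_depths(capture_depths(None), tol)
    for _ in range(max_iters):
        if not err.any():
            break
        mask = error_mask_to_projector(err)          # illuminate only the error pixels
        new_fused, new_err = fuse_depths(capture_depths(mask), tol)
        fixed = err & ~new_err                       # pixels resolved in this iteration
        fused[fixed] = new_fused[fixed]
        err = err & new_err
    return fused, err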
An ensemble of patterns
[Scene with strong, high-frequency inter-reflections]

An ensemble of patterns
[Results: Regular Gray (11 images) vs. Ensemble Codes (41 images), with error pixels marked]

An ensemble of patterns
[Error map and the illumination masks used in iterations 2 and 3]
An ensemble of patterns
Iterative improvement
[Results: regular patterns only (11 images), Ensemble (41 images), Iteration 2 (61 images), Iteration 3 (81 images)]
An ensemble of patterns
[Results: Conventional Gray codes (11 images), Ensemble Codes (41 images), Error Correction with 2 iterations (81 images)]
An ensemble of patterns

Pro: Accurate 3D reconstruction
Con: Requires more acquisition time (can be mitigated with colored patterns; still about 2.5× fewer images than other approaches)
Colored Pattern
Limitations

Volumetric scattering
Acquisition speed (binary codes)
Violation of the low/high frequency pattern condition
Classification of indirect illumination


Conclusion / Score

Full Score - 1
Good methodology
Strong reasoning
Clear limitations and future work
Creative idea
Q&A

Further Research

How do we design patterns?

Xida Chen, Yee-Hong Yang (2015)


More results: Depth Map
More results: 3D Visualization
More results: Depth Map
More results: 3D Visualization
Extra slide

How is the error correcting illumination pattern created for the projector's point of view?
