This document summarizes a technique for recovering high dynamic range radiance maps from multiple low dynamic range photographs with varying exposures. It involves constructing an aggregate mapping from sensor irradiance to pixel values using a least squares approach to solve for the unknown camera response function and irradiance values. This allows combining exposures to reduce noise and obtain radiance maps that can be used for image-based rendering with an extended dynamic range compared to a single photograph.
2. The ratio between the largest and smallest possible values of a changeable quantity, in this case, light.
In other words, the range of signals within which we can operate with acceptable distortion.
3. The human eye has a very high dynamic
range, with the ratio of brightest to darkest
signal being nearly 10,000 to 1.
In practice, cameras struggle to capture the full dynamic range experienced by humans.
5. To cover a wide dynamic range, we need to
combine several photographs taken at
different exposures. Essentially, a Radiance
Map is constructed.
6. Image-Based Rendering and Image-Based Modeling.
Image processing, such as synthetic motion blur.
Image compositing, specifically for videos.
Quantitative evaluation of rendering algorithms.
7. Scene Radiance is transformed to digital pixel
values for both film and digital cameras.
The algorithm determines the aggregate mapping from scene radiance L to pixel values Z for a set of images with different exposures.
8. Both Physical and Electronic Imaging Systems are
based on the assumption of Reciprocity.
X = EΔt
X – Sensor Exposure
E – Sensor Irradiance
Δt – Exposure Time
Only the product EΔt affects the optical density of the processed film.
9. Consider an unknown nonlinear function:
f(X) = Z, where X is the exposure and Z is the final pixel value,
and f is a monotonically increasing (hence invertible) function.
Therefore,
Zij = f(EiΔtj)
where i indexes pixel locations and j indexes exposures.
10. Inverting the function,
f –1(Zij) = EiΔtj
So, ln f –1(Zij) = lnEi + lnΔtj
Assume g = ln f –1
g(Zij) = ln Ei + ln Δtj
Solve in the least-error sense for the sensor irradiances Ei
and a smooth, monotonic function g.
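A quick numeric check of this separation, assuming a hypothetical gamma-style response f(X) = X^0.45 (an illustrative stand-in, not any camera's actual curve):

```python
import numpy as np

# Assume a hypothetical response f(X) = X**0.45 (a simple gamma curve).
gamma = 0.45
f = lambda X: X ** gamma                   # exposure -> pixel value
g = lambda Z: np.log(Z ** (1.0 / gamma))   # g = ln f^{-1}

E, dt = 2.0, 0.125                         # irradiance, exposure time
Z = f(E * dt)
# g(Z) splits into the two log terms:
print(np.isclose(g(Z), np.log(E) + np.log(dt)))   # True
```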
11. The Least Squares Error and Smoothness term
have to be minimized.
In a discrete, finite world, for N pixel
locations and P photographs,
The domain of Z is finite: (Zmax – Zmin + 1) possible values.
So essentially this is a linear least-squares
problem, solvable by Singular Value Decomposition (SVD).
12. Combining the previous equations, the objective
function takes a quadratic form, which is minimized
by a linear least-squares solve (done in Matlab in the paper).
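A sketch of that minimization in Python rather than Matlab, modeled on the paper's gsolve routine (variable names follow the paper; `np.linalg.lstsq` stands in for an explicit SVD, and an 8-bit range n = 256 is assumed):

```python
import numpy as np

def gsolve(Z, B, lam, w):
    """Recover g and ln E by linear least squares.
    Z   : (N, P) int array, pixel values at N locations over P exposures
    B   : (P,) log exposure times ln(dt_j)
    lam : smoothness weight
    w   : (256,) weighting function over pixel values"""
    n = 256
    N, P = Z.shape
    A = np.zeros((N * P + n - 1, n + N))
    b = np.zeros(A.shape[0])

    k = 0
    for i in range(N):                     # data-fitting term
        for j in range(P):
            wij = w[Z[i, j]]
            A[k, Z[i, j]] = wij            # g(Z_ij)
            A[k, n + i] = -wij             # -ln E_i
            b[k] = wij * B[j]              # ln dt_j
            k += 1

    A[k, n // 2] = 1.0                     # fix the scale: g(Z_mid) = 0
    k += 1

    for z in range(1, n - 1):              # smoothness term lam * w(z) * g''(z)
        A[k, z - 1] = lam * w[z]
        A[k, z] = -2.0 * lam * w[z]
        A[k, z + 1] = lam * w[z]
        k += 1

    x = np.linalg.lstsq(A, b, rcond=None)[0]
    return x[:n], x[n:]                    # g over 0..255, ln E per location
```

Here Z holds only the sampled pixel locations, not the whole image; once g is recovered, it can be applied to every pixel.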
13. g(z) is steep and fits the data poorly near the extremes, hence
a weighting function w(z) is introduced to emphasize
the middle of the range.
Hence, Zmid = ½(Zmin + Zmax) is defined, and:
w(z) = z – Zmin for z ≤ Zmid
w(z) = Zmax – z for z > Zmid
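A minimal sketch of such a mid-emphasizing weight for 8-bit values (a hat function, with Zmin = 0, Zmax = 255, and an integer Zmid assumed):

```python
Zmin, Zmax = 0, 255            # assumed 8-bit pixel range
Zmid = (Zmin + Zmax) // 2      # 127 with integer division

def w(z):
    # Emphasize mid-range values; down-weight the poorly fit extremes.
    return z - Zmin if z <= Zmid else Zmax - z
```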
16. Not every pixel location needs to be used in the
recovery of the response function.
Due to the presence of logarithms, the algorithm
recovers the irradiances only up to a scale factor.
For a Z range of 255 and 11 photographs, N need
not exceed 50; however, the sampled pixels should be
evenly distributed across the range of Z.
To improve smoothness, the second derivative of g is
approximated with divided differences.
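The figures above follow from requiring more equations than unknowns, N·(P − 1) > Zmax − Zmin (the sufficiency condition given in the paper):

```python
# Unknowns: 256 samples of g plus N log irradiances.
# Equations: N * P data rows (one per pixel location per photograph).
Zmax, Zmin = 255, 0
P = 11                              # photographs
N = 50                              # sampled pixel locations
assert N * (P - 1) > Zmax - Zmin    # 500 > 255: amply overdetermined
```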
17. Once g is recovered and the exposure times are
known, the pixel values can be converted to
relative radiance values.
Combining multiple exposures reduces noise
in the recovered radiance values.
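That combination is a weighted average in the log domain, ln Ei = Σj w(Zij)(g(Zij) − ln Δtj) / Σj w(Zij); a sketch (the function name and the saturation guard are ours):

```python
import numpy as np

def radiance_map(Z, B, g, w):
    """Combine P exposures into one ln(irradiance) per pixel location.
    Z : (N, P) pixel values, B : (P,) log exposure times,
    g : (256,) recovered response, w : (256,) weighting function."""
    W = w[Z]                                   # per-sample weights
    num = (W * (g[Z] - B)).sum(axis=1)         # weighted sum of ln E estimates
    den = np.maximum(W.sum(axis=1), 1e-8)      # guard fully saturated pixels
    return num / den                           # ln E_i
```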
18. The recovered radiance map is computed as an
array of single-precision floating point values.
Alternatively, with a shared 8-bit exponent, only one
exponent value is stored alongside the three 8-bit color
values at each pixel, which significantly reduces the
storage space for the image.
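That shared-exponent layout can be sketched as follows (after Ward's 32-bit RGBE "Real Pixels" idea; this helper is illustrative, not a reference encoder):

```python
import math

def float_to_rgbe(r, g, b):
    """Pack three floats into 8-bit mantissas plus one shared 8-bit exponent."""
    m = max(r, g, b)
    if m < 1e-32:
        return (0, 0, 0, 0)            # conventional encoding of black
    mant, exp = math.frexp(m)          # m = mant * 2**exp, mant in [0.5, 1)
    scale = 256.0 * mant / m           # maps the largest channel into [128, 256)
    return (int(r * scale), int(g * scale), int(b * scale), exp + 128)

print(float_to_rgbe(1.0, 0.5, 0.25))   # (128, 64, 32, 129)
```

Each channel is recovered as c · 2^(e − 128) / 256, so all three colors share one brightness scale per pixel.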
19. To recover the film response curve, a minimum
of two images is required. The two images must have
different exposures, close enough that their pixel
values overlap. Using more images improves the result.
To recover a radiance map from the film
response curve, the number of images needed
increases with the range of radiance values, and
decreases with the dynamic range of the images.
An extended dynamic range image can be
obtained from a single exposure by manipulating
the brightness and density adjustments.
20. The response curve is constructed for the
red, green and blue channels separately.
However, 3 unknown scaling factors are
needed to relate relative radiance to absolute
radiance.
Changing the scaling factors changes the
color balance of the final image.
21. The blue curve is slightly different, since the darkened regions of
the image tend to display a blue cast.