IMAGE SENSING AND ACQUISITION
SEMINAR REPORT
SUBMITTED IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
BACHELOR OF TECHNOLOGY
IN
ELECTRONICS AND COMMUNICATION ENGINEERING
BY
O.V.S SHASHANK RAM (12SS1A0446)
Department of Electronics and Communication Engineering
Jawaharlal Nehru Technological University Hyderabad
College of Engineering
Sultanpur, Pulkal (M), Medak-502293, Telangana
2016
Jawaharlal Nehru Technological University Hyderabad
College of Engineering
Sultanpur, Pulkal (M), Medak-502293, Telangana
Department of Electronics and Communication Engineering
CERTIFICATE
Date:
This is to certify that the seminar work entitled “IMAGE SENSING AND
ACQUISITION” is a bonafide work carried out by “O.V.S SHASHANK RAM” bearing
Roll no.12SS1A0446 in partial fulfillment of the requirements of the degree of
BACHELOR OF TECHNOLOGY in ELECTRONICS & COMMUNICATION
ENGINEERING by the Jawaharlal Nehru Technological University, Hyderabad during
the academic year 2015-16.
The results embodied in this report have not been submitted to any other
University or Institution for the award of any degree or diploma.
------------------------- ---------------------
Mr. V. Rajanesh Mr. B. Prabhakar
Associate Professor Associate Professor
Guide Head of the Department
Abstract
The inclusion of cameras in everything from cell phones to pens to children’s
toys is possible because of the low cost and low power consumption of the imaging
arrays that form the core of the cameras. However, these arrays are low cost and low
power because they are CMOS-based; this allows for the devices to be made with the
same processes and facilities that are used to make memory and computer chips. Yet, the
continued surge in CMOS imager popularity goes beyond the lower cost to other factors
such as ability to integrate the sensors with electronics, and the ability to achieve fast,
customizable frame rates.
People have been using camera and film for more than 100 years, both for still
photography and movies. There is something magical about the process -- humans are
visual creatures, and a picture really does paint a thousand words for us!
In this report we will discuss how an image is sensed and then acquired,
what a camera is, and how it works. Various image sensors are discussed, along with
the differences between these sensors; the differences between analog and digital
image sensing are also discussed in detail.
Contents:
1. Introduction
2. Camera
3. History of Camera
4. Components of a Camera
5. Working of a Camera
6. Image Sensors
7. CCD vs CMOS
8. Analog (Film) vs Digital
9. Conclusion
References
Introduction:
Before any video or image processing can commence, an image must be
captured by a camera and converted into a manageable entity. This whole process is
termed image sensing and acquisition.
Image sensing and acquisition deals with how the image is sensed and
then acquired as desired. It is the first and foremost of the elements of digital image
processing. Only after this important step can we implement any processing on the
image.
The electronic device most commonly associated with image sensing and
acquisition is called a camera.
As a photographer I have always been interested in the camera as a medium.
I read Darwin’s writings on the evolution of the eye in his “On the Origin of Species”
and was struck (as many have been) by its remarkable similarity to the development of
photography. Nature did not create the eye fully formed; Darwin demonstrated how
this occurred over countless generations in a slow, methodical process. It started with a
flat disk of light-sensitive cells that could detect the presence of light but nothing
more. This disk began to dimple and grew deeper to form a cup. As the cup closed
over, the opening formed an aperture which had enough power to resolve a dim, fuzzy
image onto the back of the proto-eye (an arrangement still used today by the nautilus).
This cup filled with mucus and an encapsulated lens formed over the opening. We now
had the camera-type eye, a very sharp and bright camera for seeing the world.
The analogy to human-made cameras is not perfect. We did not progress
slowly by dimpling light-sensitive paper until it eventually became a Nikon. But
cameras have evolved on their own trajectory. Humans were playing with pinhole
camera obscuras, as written about in ancient Chinese and Greek texts, and we had
simple-lens camera lucidas by the early Renaissance. So what was the first
human-made camera? What was equivalent to the flat disk of light-sensitive cells?
Many theorists have written about the analogy of Plato’s cave, in which people trace
the shadows cast on the back wall of a cave. The first human-made proto-cameras
could have been present with the first proto-humans, who watched their shadows
dance across the cave walls, cast from their fire pits.
Photography is undoubtedly one of the most important inventions in history --
it has truly transformed how people conceive of the world. Now we can "see" all sorts
of things that are actually many miles -- and years -- away from us. Photography lets
us capture moments in time and preserve them for years to come.
First let us discuss how a camera operates, the history of camera
technology, and how the name originated.
Then we will discuss what an image sensor is, the different types of
image sensors and their basic operation, and finally how a color image is formed.
Camera:
With a rather gentle introduction, we ask ourselves what a camera really is,
and what its different components are. Chances are that you will already know some
of this, but going through it anyway will at least ensure that we have defined a
common vocabulary.
In the strictest sense, it is simply a device which can record light. It does so by
focusing light on a photosensitive surface. From this simple sentence, we can see the
three main parts of any camera.
The basic technology that makes all of this possible is fairly simple. A still film
camera is made of three basic elements: an optical element (the lens), a chemical
element (the film) and a mechanical element (the camera body itself). As we'll see,
the only trick to photography is calibrating and combining these elements in such a
way that they record a crisp, recognizable image.
History of Camera:
1500 – Camera Obscura
The first pinhole camera (also called the Camera Obscura)
was invented by Alhazen (Ibn Al-Haytham).
Camera Obscura (Latin in origin) means a vaulted (closed)
room. Basically, a pinhole camera should be built in such a way
that light enters only through the hole.
1839-Daguerreotype Camera
The Daguerreotype camera was announced by the
French Academy of Sciences. Examples of these cameras are
now among the world’s most expensive.
1840-First Patent
The first American patent in photography was issued to
Alexander Wolcott for his camera.
1859-Panoramic camera
The panoramic camera was patented by
Thomas Sutton.
1861-Stereoscope viewer
Oliver Wendell Holmes invents the stereoscope
viewer.
1888
George Eastman patents the Kodak roll-film
camera. Eastman was a pioneer in the use of photographic film. He also started
manufacturing paper film in 1885. His first Kodak box camera was very simple and
very cheap.
1900
The first mass-marketed camera, the Brownie, was
presented by Eastman. It was on sale until the 1960s.
1900
The Raisecamera
(travel camera) was invented. Its extremely light weight
and small dimensions when folded made this
camera the most desirable thing for landscape
photographers.
1913/1914
The first 35mm still camera (also called the
“candid” camera) was developed by Oskar
Barnack of the German company Leica Camera. Later it
became the standard for all film cameras.
1948
Edwin Land invented the Polaroid camera which
could take a picture and print it in about one
minute.
1960
EG&G develops an extreme-depth underwater
camera for the U.S. Navy.
1978
Konica introduces the first point-and-shoot autofocus camera, the Konica C35 AF. It
was named “Jasupin”.
1981
Sony demonstrates the Sony Mavica – the
world’s first digital electronic still camera.
1986
Fuji introduced the disposable camera. The
inventors also call these devices “single-use cameras”.
1991
Kodak released the first professional digital camera
system (DCS) which was of a great use for
photojournalists. It was a modified Nikon F-3 camera
with a 1.3 megapixel sensor.
1994-1996
The first digital cameras for the consumer-level market that worked with a home
computer via a serial cable were the Apple QuickTake 100 (February 17,
1994), the Kodak DC40 (March 28, 1995), the Casio QV-11 (with LCD
monitor, late 1995), and Sony’s Cyber-Shot Digital Still Camera (1996).
2000
In Japan, Sharp’s J-SH04 introduced the world’s
first camera phone.
2005
The Canon EOS 5D is launched. This is the first
consumer-priced full-frame digital SLR, with a 24x36mm CMOS sensor.
Components of a Camera:
A camera consists of:
1. Lens
2. Aperture
3. Shutter
4. Photo-sensing element
5. Buffer
6. ISP (image signal processor)
The components, explained:
The photosensitive surface reacts to light through either a chemical process
(film) or an electric one (digital sensor). There are fundamental differences between
these two, which we will cover subsequently, but for now we can consider both of
them to be identical: they are a grid of several million tiny dots (pixels), and each can
remember how much light it received in a given period of time. There are three
important qualities to each sensor: resolution, size and what we can call “quality”.
 Resolution is simply the number of pixels (it is slightly more complicated with
film, let’s forget about it for now). The more pixels you have, the more fine
grained details you can theoretically record. Any resolution above 2 or 3
megapixels (i.e. millions of pixels) will be enough for displaying on a screen,
but higher resolutions come into play for two important applications: printing
and cropping.
o In order to have a good reproduction quality, it is generally estimated
that between 240 and 300 pixels should be used for every inch of paper
(dots per inch, or dpi), which will give a natural limitation to the
biggest size one can print. For instance, a 6MP image of dimensions
2000×3000 pixels can be printed at a maximum size of 12.5×8.3″ at
240dpi (2000/240 = 8.3, 3000/240 = 12.5). It is possible to print bigger
by either lowering the dpi or artificially increasing the resolution, but
this will come at a serious loss of image quality. Having a higher
resolution allows you to print bigger.
o Cropping means reducing the size of an image by discarding pixels on
the sides. It is a very useful tool and can often improve composition or
remove unwanted elements from an image. However, it will also
decrease resolution (since you lose pixels), so how much cropping you
allow yourself will depend on the initial resolution, which you want to
be as high as possible. This is also what some cheaper cameras call
“digital zoom”, whose use should be avoided like the plague, as the same
effect can very easily be reproduced in post-processing, and the loss of
image quality is often enormous.
 The physical size of the sensor is very important and will have an impact on
many other parameters, most of which we will see in subsequent lessons: crop
factor, depth of field, high-ISO noise and dynamic range are some of them. Bigger
sensors also allow for more widely spaced pixels (increasing image
quality) or more of them (increasing resolution). Bigger is almost always
better, and this is one of the main reasons that DSLRs (and medium format
cameras) produce much better images than compact cameras. Later, we will
cover the different types of cameras in more detail.
 Finally, sensor quality is harder to quantify, but it refers to how well the sensor
reacts to difficult light conditions: either low light, which will require
increasing ISO and for which we want the sensor to have as little noise as
possible, or high contrast, which requires a good dynamic range to be
recorded adequately.
The lens is the second
component of any camera. It is an
optical device which takes scattered
light rays and focuses them neatly
on the sensor. Lenses are often
complex, with up to 15 different
optical elements serving different
roles. The quality of the glass and
the precision of the lens will be
extremely important in determining
how good the final image is.
Lenses must make
compromises, and a perfect all
around lens is physically
impossible to build. For this reason,
good lenses tend to be specialized
and having the ability to switch
them on your camera will prove
extremely useful.
Lenses usually come with cryptic sequences of symbols and numbers which
describe their specifications. Without going too much into detail, let’s review some
of their characteristics:
 Focal length refers roughly to the “zoom level”, or angle of view, of the lens.
It will have its own lesson in a few days, as it can be a surprisingly tricky
subject. A focal length is usually expressed in millimeters, and you should be
aware that the resulting angle of view actually depends on the size of the
sensor of the camera on which the lens is used (this is called the crop factor).
For this reason, we often give “35mm equivalent” focal lengths, which is the
focal length that would offer the same view on a 35mm camera (the historic
film SLR format) and allows us to make meaningful comparisons. If there is a
single length (e.g. 24mm), then the lens doesn’t zoom, and it is often called a
prime lens. If there are two numbers (e.g. 18-55mm), then you can use the lens
at any focal length in that range. Compact cameras often don’t give focal lengths
but simply the range, for instance 8x. This means that the long end is 8 times
longer than the wide one, so the lens could for instance be an 18-144mm, or a
35-280mm, etc.
 The aperture is a very important concept which we will talk about in much
detail later on. The aperture is an iris in the centre of the lens which can close
to increasingly small sizes, limiting the amount of light which gets to the
sensor. It is referred to by an f-number, for instance f/2.8. To make things worse,
it is quite counter-intuitive, as the smaller the number, the bigger the aperture!
For now, we don’t have to worry about this too much. The important number
on a lens is the maximal aperture, the lower the better. Professional zoom
lenses often have f/2.8 maximal apertures, and cheaper consumer lenses have
ranges such as f/3.5-5.6, meaning that at the wide end, the maximum aperture
is f/3.5 and at the long end, it is f/5.6. Apertures can be closed to tiny sizes,
usually at least f/22.
 Lenses also need a focusing system. Nowadays, most lenses have an internal
motor which can be piloted by the camera: the autofocus. They also have a
ring to allow the photographer to focus manually. There are plenty of options
for autofocus motors as well, for instance hypersonic or silent ones.
 Lenses are increasingly equipped with stabilisation systems (called VR by
Nikon, IS by Canon). They detect small movements, usually handshake, and
compensate for them by moving the optical elements internally in the opposite
direction. Though not magic pills, these systems tend to work very well and
allow you to take sharp images at quite slow shutter speeds.
 Finally, lenses can have all sorts of fancy options: apochromatic glass, nano-
coating, etc., designed to increase the quality of the final image. You probably
shouldn’t worry too much about those.
Finally, the body is the light tight box connecting the lens to the sensor, and
ordering everyone around. Though some film cameras are just that, black boxes, most
digital cameras are now small computers, sporting all sorts of features, often of
dubious usefulness. Let’s review some of the components found in most bodies:
 The most important is probably the shutter. Think of it as a curtain in front of
the sensor. When you press the trigger, the curtain opens, exposes the sensor to
light from the lens, then closes again after a very precise amount of time, often
a tiny fraction of a second. Most shutters operate between 30 seconds and
1/4000th of a second. That duration (the shutter speed) is one of the three very
important exposure factors, along with aperture and ISO.
 A light meter. As the name suggests, it measures the quantity of light and sets
the exposure accordingly. How much manual control you keep at this stage is
one of the most important questions in photography. There are different
metering modes, but except in very specific cases, using the most advanced,
most automated one (matrix metering on Nikon cameras) will provide the best
results.
 A focus detector, used to drive the autofocus motor in the lens. There are two
competing technologies, contrast detection and phase detection, with the latter
currently having an edge, which explains why DSLRs tend to focus faster
than compact cameras.
 A way to store the image just created. Back in the days of film, this was just a
lever to advance the roll to the next unexposed frame. Now, it is a pipeline
which ends up in the memory card that the camera is using.
 A way to frame. It can be a multitude of things, optical or electronic
viewfinder, LCD screen or even ground glass.
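Shutter speed, aperture and ISO combine into a single exposure level. The standard exposure-value formula, EV = log2(N²/t), is not stated in the report, but a short sketch shows how it ties aperture N and shutter time t together:

```python
import math

def exposure_value(f_number, shutter_s):
    """Exposure value at base ISO: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_s)

# Halving the shutter time (1/60 s -> 1/120 s) raises EV by exactly one stop:
one_stop = exposure_value(2.8, 1/120) - exposure_value(2.8, 1/60)
print(round(one_stop, 3))  # 1.0
```

Any combination of aperture and shutter speed giving the same EV admits the same total amount of light, which is what makes the exposure controls interchangeable.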
The optical component of the camera is the lens. At its simplest, a lens is just
a curved piece of glass or plastic. Its job is to take the beams of light bouncing off of
an object and redirect them so they come together to form a real image -- an image
that looks just like the scene in front of the lens.
But how can a piece of glass do this? The process is actually very simple.
As light travels from one medium to another, it changes speed. Light travels more
quickly through air than it does through glass, so a lens slows it down.
When light waves enter a piece of glass at an angle, one part of the wave will
reach the glass before another and so will start slowing down first. This is something
like pushing a shopping cart from pavement to grass, at an angle. The right wheel hits
the grass first and so slows down while the left wheel is still on the pavement.
Because the left wheel is briefly moving more quickly than the right wheel, the
shopping cart turns to the right as it moves onto the grass.
The effect on light is the same -- as it enters the glass at an angle, it bends in one
direction. It bends again when it exits the glass, because parts of the light wave enter
the air and speed up before other parts of the wave. In a standard converging,
or convex lens, one or both sides of the glass curve out. This means rays of light
passing through will bend toward the center of the lens on entry. In a double convex
lens, such as a magnifying glass, the light will bend when it exits as well as when it
enters.
This effectively reverses the path of light from an object. A light source -- say
a candle -- emits light in all directions. The rays of light all start at the same point --
the candle's flame -- and then are constantly diverging. A converging lens takes those
rays and redirects them so they are all converging back to one point. At the point
where the rays converge, you get a real image of the candle. In the next couple of
sections, we'll look at some of the variables that determine how this real image is
formed.
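The converging behaviour described above is conventionally modelled by the thin-lens equation, 1/f = 1/do + 1/di. The report does not state it explicitly, but a short sketch shows how the image distance follows from the focal length and the object distance:

```python
def image_distance(focal_mm, object_mm):
    """Thin-lens equation 1/f = 1/do + 1/di, solved for the image distance di."""
    return 1 / (1 / focal_mm - 1 / object_mm)

# For a 50 mm lens: a closer object converges farther from the lens
# than a distant one, which is why focusing moves the lens:
print(image_distance(50, 5000))  # ~50.5 mm (distant object)
print(image_distance(50, 500))   # ~55.6 mm (close object)
```

This is exactly the behaviour exploited when focusing: moving the lens relative to the film lines the converged image up with the film plane.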
Cameras: Focus
We've seen that a real image is formed by light moving through a convex
lens. The nature of this real image varies depending on how the light travels through
the lens. This light path depends on two major factors:
 The angle of the light beam's entry into the lens
 The structure of the lens
The angle of light entry changes when you move the object closer or farther
away from the lens. You can see this in the diagram below. The light beams from the
pencil point enter the lens at a sharper angle when the pencil is closer to the lens and
a more obtuse angle when the pencil is farther away. But overall, the lens only bends
the light beam to a certain total degree, no matter how it enters. Consequently, light
beams that enter at a sharper angle will exit at a more obtuse angle, and vice versa.
The total "bending angle" at any particular point on the lens remains constant.
As you can see, light beams from a closer point converge farther away from
the lens than light beams from a point that's farther away. In other words, the real
image of a closer object forms farther away from the lens than the real image from a
more distant object.
You can observe this phenomenon with a simple experiment. Light a candle in
the dark, and hold a magnifying glass between it and the wall. You will see an upside
down image of the candle on the wall. If the real image of the candle does not fall
directly on the wall, it will appear somewhat blurry. The light beams from a particular
point don't quite converge at this point. To focus the image, move the magnifying
glass closer or farther away from the candle.
This is what you're doing when you turn the lens of a camera to focus it --
you're moving it closer or farther away from the film surface. As you move the lens,
you can line up the focused real image of an object so it falls directly on the film
surface.
You now know that at any one point, a lens bends light beams to a certain total
degree, no matter the light beam's angle of entry. This total "bending angle" is
determined by the structure of the lens.
Camera Lenses:
A standard 50 mm lens doesn't significantly shrink or magnify the image.
In the last section, we saw that at any one point, a lens bends light beams to a
certain total degree, no matter the light beam's angle of entry. This total "bending
angle" is determined by the structure of the lens.
A lens with a rounder shape (a center that extends out farther) will have a
more acute bending angle. Basically, curving the lens out increases the distance
between different points on the lens. This increases the amount of time that one part
of the light wave is moving faster than another part, so the light makes a sharper turn.
Increasing the bending angle has an obvious effect. Light beams from a
particular point will converge at a point closer to the lens. In a lens with a flatter
shape, light beams will not turn as sharply. Consequently, the light beams will
converge farther away from the lens. To put it another way, the focused real image
forms farther away from the lens when the lens has a flatter surface.
Increasing the distance between the lens and the real image actually increases
the total size of the real image. If you think about it, this makes perfect sense. Think
of a projector: As you move the projector farther away from the screen, the image
becomes larger. To put it simply, the light beams keep spreading apart as they travel
toward the screen.
The same basic thing happens in a camera. As the distance between the lens
and the real image increases, the light beams spread out more, forming a larger real
image. But the size of the film stays constant. When you attach a very flat lens, it
projects a large real image but the film is only exposed to the middle part of it.
Basically, the lens zeroes in on the middle of the frame, magnifying a small section of
the scene in front of you. A rounder lens produces a smaller real image, so the film
surface sees a much wider area of the scene (at reduced magnification).
Professional cameras let you attach different lenses so you can see the scene at
various magnifications. The magnification power of a lens is described by its focal
length. In cameras, the focal length is defined as the distance between the lens and the
real image of an object in the far distance (the moon, for example). A higher focal
length number indicates a greater image magnification.
Different lenses are suited to different situations. If you're taking a picture of a
mountain range, you might want to use a telephoto lens, a lens with an especially
long focal length. This lens lets you zero in on specific elements in the distance, so
you can create tighter compositions. If you're taking a close-up portrait, you might use
a wide-angle lens. This lens has a much shorter focal length, so it shrinks the scene in
front of you. The entire face is exposed to the film even if the subject is only a foot
away from the camera. A standard 50 mm camera lens doesn't significantly magnify
or shrink the image, making it ideal for shooting objects that aren't especially close or
far away.
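The relationship between focal length and how much of the scene the film sees can be sketched with the standard angle-of-view formula (assumed here, not given in the report; a 35mm-format frame is taken as 36 mm wide):

```python
import math

def angle_of_view_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal angle of view: 2 * atan(sensor width / (2 * focal length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

print(round(angle_of_view_deg(50), 1))   # ~39.6 deg: the 'standard' lens
print(round(angle_of_view_deg(200), 1))  # ~10.3 deg: telephoto, tight framing
print(round(angle_of_view_deg(24), 1))   # ~73.7 deg: wide-angle
```

The longer the focal length, the narrower the angle, which matches the intuition above that a flatter (longer) lens magnifies a small section of the scene.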
Lenses in the Lens
A camera lens is actually several lenses combined into one unit. A single
converging lens could form a real image on the film, but it would be warped by a
number of aberrations.
One of the most significant warping factors is that different colors of light
bend differently when moving through a lens. This chromatic aberration essentially
produces an image where the colors are not lined up correctly.
Cameras compensate for this using several lenses made of different materials.
The lenses each handle colors differently, and when you combine them in a certain
way, the colors are realigned.
In a zoom lens, you can move different lens elements back and forth. By
changing the distance between particular lenses, you can adjust the magnification
power -- the focal length -- of the lens as a whole.
Cameras: Recording Light
The chemical component in a traditional camera is film. Essentially, when you
expose film to a real image, it makes a chemical record of the pattern of light.
It does this with a collection of tiny light-sensitive grains, spread out in a
chemical suspension on a strip of plastic. When exposed to light, the grains undergo a
chemical reaction.
Once the roll is finished, the film is developed -- it is exposed to other
chemicals, which react with the light-sensitive grains. In black and white film, the
developer chemicals darken the grains that were exposed to light. This produces a
negative, where lighter areas appear darker and darker areas appear lighter, which is
then converted into a positive image in printing.
Color film has three different layers of light-sensitive materials, which
respond, in turn, to red, green and blue. When the film is developed, these layers are
exposed to chemicals that dye the layers of film. When you overlay the color
information from all three layers, you get a full-color negative.
So far, we've looked at the basic idea of photography -- you create a real
image with a converging lens, and you record the light pattern of this real image on a
layer of light-sensitive material. Conceptually, this is all that's involved in taking a
picture. But to capture a clear image, you have to carefully control how everything
comes together.
Obviously, if you were to lay a piece of film on the ground and focus a real
image onto it with a converging lens, you wouldn't get any kind of usable picture. Out
in the open, every grain in the film would be completely exposed to light. And
without any contrasting unexposed areas, there's no picture.
To capture an image, you have to keep the film in complete darkness until it's
time to take the picture. Then, when you want to record an image, you let some light
in. At its most basic level, this is all the body of a camera is -- a sealed box with
a shutter that opens and closes between the lens and film. In fact, the term camera is
shortened from camera obscura, literally "dark room" in Latin.
For the picture to come out right, you have to precisely control how much
light hits the film. If you let too much light in, too many grains will react, and the
picture will appear washed out. If you don't let enough light hit the film, too few
grains will react, and the picture will be too dark. In the next section, we'll look at the
different camera mechanisms that let you adjust the exposure.
What's in a Name?
As it turns out, the term photography describes the photographic process quite
accurately. Sir John Herschel, a 19th century astronomer and one of the first
photographers, came up with the term in 1839. The term is a combination of two
Greek words -- photos meaning light and graphein meaning writing (or drawing). The
term camera comes from camera obscura, Latin for "dark room." The camera
obscura was actually invented hundreds of years before photography. A traditional
camera obscura was a dark room with light shining through a lens or tiny hole in the
wall. Light passed through the hole, forming an upside-down real image on the
opposite wall. This effect was very popular with artists, scientists and curious
spectators.
Cameras: The Right Light
The plates in the iris diaphragm fold in on each other to shrink the
aperture and expand out to make it wider.
In the last section, we saw that you need to carefully control the film's
exposure to light, or your picture will come out too dark or too bright. So how do you
adjust this exposure level? You have to consider two major factors:
 How much light is passing through the lens
 How long the film is exposed
To increase or decrease the amount of light passing through the lens, you have
to change the size of the aperture -- the lens opening. This is the job of the iris
diaphragm, a series of overlapping metal plates that can fold in on each other or
expand out. Essentially, this mechanism works the same way as the iris in your eye --
it opens or closes in a circle, to shrink or expand the diameter of the lens. When the
lens is smaller, it captures less light, and when it is larger, it captures more light.
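The f-number mentioned earlier ties the iris opening to the focal length: N = f/D. A small sketch of that standard relationship (not specific to any camera) shows why a smaller f-number means a bigger opening:

```python
import math

def aperture_diameter_mm(focal_mm, f_number):
    """The f-number is the focal length divided by the aperture diameter."""
    return focal_mm / f_number

def aperture_area_mm2(focal_mm, f_number):
    """Light admitted is proportional to the area of the circular opening."""
    d = aperture_diameter_mm(focal_mm, f_number)
    return math.pi * (d / 2) ** 2

# On a 50 mm lens, f/2.8 is a far larger opening than f/22:
print(aperture_diameter_mm(50, 2.8))  # ~17.9 mm
print(aperture_diameter_mm(50, 22))   # ~2.3 mm

# One full stop (f/2.8 -> f/4) roughly halves the light-gathering area:
print(aperture_area_mm2(50, 2.8) / aperture_area_mm2(50, 4.0))  # ~2.04
```

This halving per stop is why the familiar f-stop sequence (2.8, 4, 5.6, 8, ...) advances by factors of roughly the square root of two.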
The length of exposure is determined by the shutter speed. Most SLR
cameras use a focal plane shutter. This mechanism is very simple -- it basically
consists of two "curtains" between the lens and the film. Before you take a picture, the
first curtain is closed, so the film won't be exposed to light. When you take the
picture, this curtain slides open. After a certain amount of time, the second curtain
slides in from the other side, to stop the exposure.
When you click the camera's shutter release, the first curtain slides open,
exposing the film. After a certain amount of time, the second shutter slides
closed, ending the exposure. The time delay is controlled by the camera's shutter
speed knob.
This simple action is controlled by a complex mass of gears, switches and
springs, like you might find inside a watch. When you hit the shutter button, it
releases a lever, which sets several gears in motion. You can tighten or loosen some
of the springs by turning the shutter speed knob. This adjusts the gear mechanism,
increasing or decreasing the delay between the first curtain opening and the second
curtain closing. When you set the knob to a very slow shutter speed, the shutter is
open for a very long time. When you set the knob to a very high speed, the second
curtain follows directly behind the first curtain, so only a tiny slit of the film frame is
exposed at any one time.
The ideal exposure depends on the size of the light-sensitive grains in the film.
A larger grain is more likely to absorb light photons than a smaller grain. The size of
the grains is indicated by a film's speed, which is printed on the canister. Different
film speeds are suited to different types of photography -- ISO 100 film, for example,
is optimal for shots in bright sunlight, while ISO 1600 film should only be used in
relatively low light.
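Film speed trades off against exposure time: doubling the ISO halves the light the film needs. A minimal sketch of that reciprocity (the function name is illustrative):

```python
def equivalent_shutter(shutter_s, iso_from, iso_to):
    """Shutter time giving the same exposure after a film-speed change."""
    return shutter_s * iso_from / iso_to

# An exposure needing 1/60 s on ISO 100 film needs only 1/960 s on ISO 1600:
print(equivalent_shutter(1/60, 100, 1600))  # ~0.00104 s, i.e. 1/960 s
```

This is why fast (high-ISO) film suits low light: it reaches a usable exposure before camera shake blurs the shot, at the cost of the larger, more visible grain described above.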
Inside a manual SLR camera, you'll find an intricate puzzle of gears and springs.
As you can see, there's a lot involved in getting the exposure
right -- you have to balance film speed, aperture size and shutter speed to fit the light
level in your shot. Manual SLR cameras have a built-in light meter to help you do
this. The main component of the light meter is a panel of semi-conductor light sensors
that are sensitive to light energy. These sensors express this light energy as electrical
energy, which the light meter system interprets based on the film and shutter speed.
Now, let's see how an SLR camera body directs the real image to the
viewfinder before you take the shot, and then directs it to the film when you press the
shutter button.
SLR Cameras vs. Point-and-Shoot
There are two types of consumer film cameras on the market -- SLR cameras and
"point-and-shoot" cameras. The main difference is how the photographer sees the
scene. In a point-and-shoot camera, the viewfinder is a simple window through the
body of the camera. You don't see the real image formed by the camera
lens, but you get a rough idea of what is in view.
In an SLR camera, you see the actual real image that the film will see. If you
take the lens off of an SLR camera and look inside, you'll see how this works. The
camera has a slanted mirror positioned between the shutter and the lens, with a piece
of translucent glass and a prism positioned above it. This configuration works like a
periscope -- the real image bounces off the lower mirror on to the translucent glass,
which serves as a projection screen. The prism's job is to flip the image on the screen,
so it appears right side up again, and redirect it on to the viewfinder window.
When you click the shutter button, the camera quickly flips the mirror out
of the way, so the image is directed at the exposed film. The mirror is linked to
the shutter timer system, so it stays flipped up as long as the shutter is open.
This is why the viewfinder is suddenly blacked out when you take a picture.
The mirror in an SLR camera directs the real image to the viewfinder. When you hit
the shutter button, the mirror flips up so the real image is projected onto the film.
In this sort of camera, the mirror and the translucent screen are set up so they
present the real image exactly as it will appear on the film. The advantage of this
design is that you can adjust the focus and compose the scene so you get exactly the
picture you want. For this reason, professional photographers typically use SLR
cameras.
These days, most SLR cameras are built with both manual and automatic
controls, and most point-and-shoot cameras are fully automatic. Conceptually,
automatic cameras are pretty much the same as fully manual models, but everything is
controlled by a central microprocessor instead of the user. The central microprocessor
receives information from the autofocus system and the light meter. Then it activates
several small motors, which adjust the lens and open and close the aperture. In
modern cameras, this is a pretty advanced computer system.
Automatic point-and-shoot cameras use circuit boards and electric motors, instead
of gears and springs.
In the next section, we'll look at the other end of the spectrum -- a camera design
with no complex machinery, no lens and barely any moving parts.
Throughout the history of photography, there have been hundreds of different
camera systems. But amazingly, all these designs -- from the simplest homemade box
camera to the newest digital camera -- combine the same basic elements: a lens
system to create the real image, a light-sensitive sensor to record the real image, and a
mechanical system to control how the real image is exposed to the sensor. And when
you get down to it, that's all there is to photography!
Types of Digital Image Sensors:
Working of Camera:
In the past twenty years, most of the major technological breakthroughs in consumer
electronics have really been part of one larger breakthrough. When you get down to
it, CDs, DVDs, HDTV, MP3s and DVRs are all built around the same basic process:
converting conventional analog information (represented by a fluctuating wave) into
digital information (represented by ones and zeros, or bits). This fundamental shift in
technology totally changed how we handle visual and audio information -- it
completely redefined what is possible.
The digital camera is one of the most remarkable instances of this shift
because it is so truly different from its predecessor. Conventional cameras depend
entirely on chemical and mechanical processes -- you don't even need electricity to
operate them. On the other hand, all digital cameras have a built-in computer, and all
of them record images electronically.
The new approach has been enormously successful. Since film still provides
better picture quality, digital cameras have not completely replaced conventional
cameras. But, as digital imaging technology has improved, digital cameras have
rapidly become more popular.
In this article, we'll find out exactly what's going on inside these amazing
digital-age devices.
Digital Camera Basics
Let's say you want to take a picture and e-mail it to a friend. To do this, you
need the image to be represented in the language that computers recognize -- bits and
bytes. Essentially, a digital image is just a long string of 1s and 0s that represent all
the tiny colored dots -- or pixels -- that collectively make up the image. (For
information on sampling and digital representations of data, see this explanation of
the digitization of sound waves. Digitizing light waves works in a similar way.)
If you want to get a picture into this form, you have two options:
 You can take a photograph using a conventional film camera, process
the film chemically, print it onto photographic paper and then use a digital
scanner to sample the print (record the pattern of light as a series of pixel
values).
 You can directly sample the original light that bounces off your subject,
immediately breaking that light pattern down into a series of pixel values -- in
other words, you can use a digital camera.
At its most basic level, this is all there is to a digital camera. Just like
a conventional camera, it has a series of lenses that focus light to create an image of a
scene. But instead of focusing this light onto a piece of film, it focuses it onto
a semiconductor device that records light electronically. A computer then breaks this
electronic information down into digital data. All the fun and interesting features of
digital cameras come as a direct result of this process.
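The idea that a digital image is "just a long string of 1s and 0s" can be made concrete. The sketch below uses a hypothetical 2x2 grayscale image (values chosen for illustration, not from the report) and flattens the pixel grid into its bit representation:

```python
# A hypothetical 2x2 grayscale "image": each pixel is an 8-bit
# intensity value between 0 (black) and 255 (white).
image = [
    [0, 128],
    [255, 64],
]

# Flatten the pixel grid into the long string of 1s and 0s the text
# describes -- 8 bits per pixel.
bits = "".join(format(value, "08b") for value_row in image for value in value_row)

print(bits)        # 00000000100000001111111101000000
print(len(bits))   # 32 bits for 4 pixels
```

A real camera file adds color channels, headers and compression on top of this, but the underlying principle is the same string of bits.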
In the next few sections, we'll find out exactly how the camera does all this.
Cool Facts
 With a 3-megapixel camera, you can take a higher-resolution picture than
most computer monitors can display.
 You can use your Web browser to view digital pictures taken using the JPEG
format.
 The first consumer-oriented digital cameras were sold by Kodak and Apple in
1994.
 In 1998, Sony inadvertently sold more than 700,000 camcorders with a limited
ability to see through clothes.
CCD and CMOS: Filmless Cameras
A CMOS image sensor
Instead of film, a digital camera has a sensor that converts light into electrical
charges.
The image sensor employed by most digital cameras is a charge coupled
device (CCD). Some cameras use complementary metal oxide
semiconductor (CMOS) technology instead. Both CCD and CMOS image sensors
convert light into electrons. If you've read How Solar Cells Work, you already
understand one of the pieces of technology used to perform the conversion. A
simplified way to think about these sensors is to think of a 2-D array of thousands or
millions of tiny solar cells.
Once the sensor converts the light into electrons, it reads the value
(accumulated charge) of each cell in the image. This is where the differences between
the two main sensor types kick in:
 A CCD transports the charge across the chip and reads it at one corner of the
array. An analog-to-digital converter (ADC) then turns each pixel's value
into a digital value by measuring the amount of charge at each photosite and
converting that measurement to binary form.
 CMOS devices use several transistors at each pixel to amplify and move the
charge using more traditional wires.
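The two readout styles can be contrasted with a toy sketch. This is an illustrative model only, not device physics: a grid of numbers stands in for the accumulated charge, and the two functions mimic the bucket-brigade transfer of a CCD versus the per-pixel addressing of a CMOS sensor.

```python
# Illustrative model only: a 3x3 grid of numbers stands in for the
# accumulated charge at each photosite.
charges = [
    [5, 7, 2],
    [9, 1, 4],
    [6, 8, 3],
]

def ccd_readout(grid):
    """CCD style: every charge is shifted, bucket-brigade fashion,
    toward one corner, where a single shared output node sees each
    value in turn."""
    out = []
    for row in grid:          # rows shift toward the output register
        for charge in row:    # then step across to the corner node
            out.append(charge)
    return out

def cmos_readout(grid, row, col):
    """CMOS style: each pixel has its own amplifier and select switch,
    so any photosite can be addressed and read directly."""
    return grid[row][col]

print(ccd_readout(charges))         # all nine charges, in shift order
print(cmos_readout(charges, 1, 2))  # one pixel, addressed directly
```

The contrast hints at why CMOS sensors can read arbitrary regions of interest while a CCD must clock everything out in sequence.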
Differences between the two types of sensors lead to a number of pros and
cons:
A CCD sensor
PHOTO COURTESY DALSA
 CCD sensors create high-quality, low-noise images. CMOS sensors are
generally more susceptible to noise.
 Because each pixel on a CMOS sensor has several transistors located next to
it, the light sensitivity of a CMOS chip is lower. Many of the photons hit the
transistors instead of the photodiode.
 CMOS sensors traditionally consume little power. CCDs, on the other hand,
use a process that consumes lots of power. CCDs consume as much as 100
times more power than an equivalent CMOS sensor.
 CCD sensors have been mass produced for a longer period of time, so they are
more mature. They tend to have higher quality pixels, and more of them.
Although numerous differences exist between the two sensors, they both play
the same role in the camera -- they turn light into electricity. For the purpose of
understanding how a digital camera works, you can think of them as nearly identical
devices.
Digital Camera Resolution
The size of an image taken at different resolutions
PHOTO COURTESY MORGUEFILE
The amount of detail that the camera can capture is called the resolution, and
it is measured in pixels. The more pixels a camera has, the more detail it can capture
and the larger pictures can be without becoming blurry or "grainy."
Some typical resolutions include:
 256x256 - Found on very cheap cameras, this resolution is so low that the
picture quality is almost always unacceptable. This is 65,536 total pixels.
 640x480 - This is the low end on most "real" cameras. This resolution is ideal
for e-mailing pictures or posting pictures on a Web site.
 1216x912 - This is a "megapixel" image size -- 1,109,000 total pixels -- good
for printing pictures.
 1600x1200 - With almost 2 million total pixels, this is "high resolution." You
can print a 4x5 inch print taken at this resolution with the same quality that
you would get from a photo lab.
 2240x1680 - Found on 4 megapixel cameras -- the current standard -- this
allows even larger printed photos, with good quality for prints up to 16x20
inches.
 4064x2704 - A top-of-the-line digital camera with 11.1 megapixels takes
pictures at this resolution. At this setting, you can create 13.5x9 inch prints
with no loss of picture quality.
High-end consumer cameras can capture over 12 million pixels. Some
professional cameras support over 16 million pixels, or 20 million pixels for large-
format cameras. For comparison, Hewlett-Packard estimates that the quality of 35mm
film is about 20 million pixels.
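The resolution-to-print-size relationship above is simple arithmetic. The sketch below assumes the common rule of thumb of about 300 pixels per inch for a photo-quality print (an assumption, not a figure from this report):

```python
# Assumes ~300 pixels per inch for a photo-quality print (a common
# rule of thumb, not a figure from this report).
def megapixels(width, height):
    """Total pixel count in millions."""
    return width * height / 1_000_000

def max_print_inches(width, height, ppi=300):
    """Largest print, in inches, before the image looks grainy."""
    return (width / ppi, height / ppi)

print(megapixels(1600, 1200))        # ~1.92 -- marketed as "2 megapixel"
print(max_print_inches(1600, 1200))  # about 5.3 x 4 inches at 300 ppi
```

Note how the result agrees with the list above: a 1600x1200 image works out to roughly a 4x5 inch photo-lab-quality print.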
Next, we'll look at how the camera adds color to these images.
How Many Pixels?
You may have noticed that the number of pixels and the maximum resolution
don't quite compute. For example, a 2.1-megapixel camera can produce images with a
resolution of 1600x1200, or 1,920,000 pixels. But "2.1 megapixel" means there
should be at least 2,100,000 pixels.
This isn't an error from rounding off or binary mathematical trickery. There is
a real discrepancy between these numbers because the CCD has to include circuitry
for the ADC to measure the charge. This circuitry is dyed black so that it doesn't
absorb light and distort the image.
Capturing Color
How the original (left) image is split in a beam splitter
Unfortunately, each photosite is colorblind. It only keeps track of the total
intensity of the light that strikes its surface. In order to get a full color image, most
sensors use filtering to look at the light in its three primary colors. Once the camera
records all three colors, it combines them to create the full spectrum.
There are several ways of recording the three colors in a digital camera. The
highest quality cameras use three separate sensors, each with a different filter.
A beam splitter directs light to the different sensors. Think of the light entering the
camera as water flowing through a pipe. Using a beam splitter would be like dividing
an identical amount of water into three different pipes. Each sensor gets an identical
look at the image; but because of the filters, each sensor only responds to one of the
primary colors.
The advantage of this method is that the camera records each of the three
colors at each pixel location. Unfortunately, cameras that use this method tend to be
bulky and expensive.
Another method is to rotate a series of red, blue and green filters in front of a
single sensor. The sensor records three separate images in rapid succession. This
method also provides information on all three colors at each pixel location; but since
the three images aren't taken at precisely the same moment, both the camera and the
target of the photo must remain stationary for all three readings. This isn't practical for
candid photography or handheld cameras.
Both of these methods work well for professional studio cameras, but they're
not necessarily practical for casual snapshots. Next, we'll look at filtering methods
that are more suited to small, efficient cameras.
Demosaicing Algorithms: Color Filtering
A more economical and practical way to record the primary colors is to permanently
place a filter called a color filter array over each individual photosite. By
breaking up the sensor into a variety of red, blue and green pixels, it is possible
to get enough information in the general vicinity of each sensor to make very
accurate guesses about the true color at that location. This process of looking at
the other pixels in the neighborhood of a sensor and making an educated guess is
called interpolation.
The most common pattern of filters is the Bayer filter pattern. This pattern
alternates a row of red and green filters with a row of blue and green filters. The
pixels are not evenly divided -- there are as many green pixels as there are blue and
red combined. This is because the human eye is not equally sensitive to all three
colors. It's necessary to include more information from the green pixels in order to
create an image that the eye will perceive as a "true color."
The advantages of this method are that only one sensor is required, and all the
color information (red, green and blue) is recorded at the same moment. That means
the camera can be smaller, cheaper, and useful in a wider variety of situations. The
raw output from a sensor with a Bayer filter is a mosaic of red, green and blue pixels
of different intensity.
Digital cameras use specialized demosaicing algorithms to convert this
mosaic into an equally sized mosaic of true colors. The key is that each colored pixel
can be used more than once. The true color of a single pixel can be determined by
averaging the values from the closest surrounding pixels.
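The averaging described above can be shown in a few lines. The sketch assumes a hypothetical 3x3 patch of raw sensor values in an RGGB Bayer layout, where the centre photosite is red and its four edge neighbours are green; the numbers are invented for illustration.

```python
# Hypothetical 3x3 patch of raw sensor values (RGGB layout: the centre
# photosite is red; its up/down/left/right neighbours are green).
patch = [
    [52, 110, 48],
    [96,  60, 104],
    [44, 118, 40],
]

# Estimate the missing green value at the red centre by averaging the
# four surrounding green photosites.
green_neighbours = [patch[0][1], patch[2][1], patch[1][0], patch[1][2]]
green_estimate = sum(green_neighbours) / len(green_neighbours)

print(green_estimate)   # 107.0 -- the average of 110, 118, 96 and 104
```

Real demosaicing algorithms are more sophisticated (they weight neighbours and detect edges), but this neighbour-averaging is the core idea.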
Some single-sensor cameras use alternatives to the Bayer filter pattern. X3
technology, for example, embeds red, green and blue photodetectors in silicon. Some
of the more advanced cameras subtract values using the typesetting colors cyan,
yellow, green and magenta instead of blending red, green and blue. There is even a
method that uses two sensors. However, most consumer cameras on the market today
use a single sensor with alternating rows of green/red and green/blue filters.
Digital Camera Exposure and Focus
Just as with film, a digital camera has to control the amount of light that
reaches the sensor. The two components it uses to do this, the aperture and shutter
speed, are also present on conventional cameras.
 Aperture: The size of the opening in the camera. The aperture is automatic in
most digital cameras, but some allow manual adjustment to give professionals
and hobbyists more control over the final image.
 Shutter speed: The amount of time that light can pass through the aperture.
Unlike film, the light sensor in a digital camera can be reset electronically, so
digital cameras have a digital shutter rather than a mechanical shutter.
These two aspects work together to capture the amount of light needed to
make a good image. In photographic terms, they set the exposure of the sensor. You
can learn more about a camera's aperture and shutter speed in How Cameras Work.
In addition to controlling the amount of light, the camera has to adjust the
lenses to control how the light is focused on the sensor. In general, the lenses on
digital cameras are very similar to conventional camera lenses -- some digital cameras
can even use conventional lenses. Most use automatic focusing techniques, which you
can learn more about in the article How Autofocus Cameras Work.
The focal length, however, is one important difference between the lens of a
digital camera and the lens of a 35mm camera. The focal length is the distance
between the lens and the surface of the sensor. Sensors from different manufacturers
vary widely in size, but in general they're smaller than a piece of 35mm film. In order
to project the image onto a smaller sensor, the focal length is shortened by the same
proportion. For additional information on sensor sizes and comparisons to 35mm film,
you can visit the Photo.net Web site.
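This proportional scaling is often expressed as a "crop factor". The sketch below uses an illustrative factor of 1.6, a typical value for small sensors (the number is an assumption, not taken from this report):

```python
# Illustrative crop factor of 1.6 (typical of small sensors; an
# assumed value, not a figure from this report).
def equivalent_focal_length(lens_mm, crop_factor):
    """35mm-equivalent focal length of a lens on a cropped sensor."""
    return lens_mm * crop_factor

def native_lens_for_view(equiv_mm, crop_factor):
    """Lens needed on the smaller sensor to match a given 35mm view."""
    return equiv_mm / crop_factor

print(equivalent_focal_length(50, 1.6))  # a 50mm lens frames like an 80mm lens
print(native_lens_for_view(50, 1.6))     # ~31mm recreates the classic 50mm view
```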
Focal length also determines the magnification, or zoom, when you look
through the camera. In 35mm cameras, a 50mm lens gives a natural view of the
subject. Increasing the focal length increases the magnification, and objects appear to
get closer. The reverse happens when decreasing the focal length. A zoom lens is any
lens that has an adjustable focal length, and digital cameras can
have optical or digital zoom -- some have both. Some cameras also have macro
focusing capability, meaning that the camera can take pictures from very close to
the subject.
Digital cameras have one of four types of lenses:
 Fixed-focus, fixed-zoom lenses - These are the kinds of lenses on disposable
and inexpensive film cameras -- inexpensive and great for snapshots, but fairly
limited.
 Optical-zoom lenses with automatic focus - Similar to the lens on a video
camcorder, these have "wide" and "telephoto" options and automatic focus.
The camera may or may not support manual focus. These actually change the
focal length of the lens rather than just magnifying the information that hits
the sensor.
 Digital zoom - With digital zoom, the camera takes pixels from the center of
the image sensor and interpolates them to make a full-sized image.
Depending on the resolution of the image and the sensor, this approach may
create a grainy or fuzzy image. You can manually do the same thing with
image processing software -- simply snap a picture, cut out the center and
magnify it.
 Replaceable lens systems - These are similar to the replaceable lenses on a
35mm camera. Some digital cameras can use 35mm camera lenses.
Next, we'll learn about how the camera stores pictures and transfers them to a
computer.
Storing Digital Photos
A CompactFlash card
Most digital cameras have an LCD
screen, so you can view your picture right
away. This is one of the great advantages of a
digital camera -- you get immediate feedback
on what you capture. Of course, viewing the
image on your camera would lose its charm if
that's all you could do. You want to be able to load the picture into your computer or
send it directly to a printer. There are several ways to do this.
Early generations of digital cameras had fixed storage inside the camera. You
needed to connect the camera directly to a computer with cables to transfer the
images. Although most of today's cameras are capable of connecting
through serial, parallel, SCSI, USB or FireWire connections, they usually also use
some sort of removable storage device.
Digital cameras use a number of storage systems. These are like reusable digital
film, and they use a caddy or card reader to transfer the data to a computer.
Many involve fixed or removable flash memory. Digital camera manufacturers often
develop their own proprietary flash memory devices,
including SmartMedia cards, CompactFlash cards and Memory Sticks. Some other
removable storage devices include:
 Floppy disks
 Hard disks, or microdrives
 Writeable CDs and DVDs
No matter what type of storage they use, all digital cameras need lots of room
for pictures. They usually store images in one of two formats -- TIFF, which is
uncompressed, and JPEG, which is compressed, but some use RAW format. Most
cameras use the JPEG file format for storing pictures, and they sometimes offer
quality settings (such as medium or high). The following information will give you an
idea of the file sizes you might expect with different picture sizes.
640x480
 TIFF (uncompressed) 1.0 MB
 JPEG (high quality) 300 KB
 JPEG (medium quality) 90 KB
800x600
 TIFF (uncompressed) 1.5 MB
 JPEG (high quality) 500 KB
 JPEG (medium quality) 130 KB
1024x768
 TIFF (uncompressed) 2.5 MB
 JPEG (high quality) 800 KB
 JPEG (medium quality) 200 KB
1600x1200
 TIFF (uncompressed) 6.0 MB
 JPEG (high quality) 1.7 MB
 JPEG (medium quality) 420 KB
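The table's figures can be sanity-checked with simple arithmetic: an uncompressed 24-bit image needs 3 bytes per pixel, and a JPEG size then implies a compression ratio. A sketch:

```python
# An uncompressed 24-bit image needs 3 bytes per pixel; comparing that
# with a JPEG size gives the compression ratio.
def uncompressed_bytes(width, height, bytes_per_pixel=3):
    return width * height * bytes_per_pixel

def compression_ratio(raw_bytes, compressed_bytes):
    return raw_bytes / compressed_bytes

raw = uncompressed_bytes(640, 480)
print(raw)                                       # 921600 bytes, ~0.9 MB
print(round(compression_ratio(raw, 90_000), 1))  # 10.2 -- medium JPEG is ~10x smaller
```

This matches the table: 640x480 works out to roughly 1 MB uncompressed, and the 90 KB medium-quality JPEG represents about a tenfold reduction.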
To make the most of their storage space, almost all digital cameras use some
sort of data compression to make the files smaller. Two features of digital images
make compression possible. One is repetition. The other is irrelevancy.
Imagine that throughout a given photo, certain patterns develop in the colors.
For example, if a blue sky takes up 30 percent of the photograph, you can be certain
that some shades of blue are going to be repeated over and over again. When
compression routines take advantage of patterns that repeat, there is no loss of
information and the image can be reconstructed exactly as it was recorded.
Unfortunately, this doesn't reduce files any more than 50 percent, and sometimes it
doesn't even come close to that level.
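Repetition-based compression can be illustrated with run-length encoding, one of the simplest lossless schemes. The sketch below is a minimal example of the idea, not the scheme any particular camera actually uses:

```python
def run_length_encode(pixels):
    """Collapse runs of identical values into [value, count] pairs."""
    runs = []
    for value in pixels:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1
        else:
            runs.append([value, 1])
    return runs

def run_length_decode(runs):
    """Expand the [value, count] pairs back into the original pixels."""
    return [value for value, count in runs for _ in range(count)]

# A strip of "sky": long runs of one shade compress with no loss at all.
sky = [200] * 6 + [180] * 3 + [200] * 2
encoded = run_length_encode(sky)
print(encoded)                            # [[200, 6], [180, 3], [200, 2]]
print(run_length_decode(encoded) == sky)  # True -- exact reconstruction
```

Because the decode step reproduces the input exactly, no information is lost; the limit on this kind of compression is simply how much repetition the image happens to contain.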
Irrelevancy is a trickier issue. A digital camera records more information than
the human eye can easily detect. Some compression routines take advantage of this
fact to throw away some of the more meaningless data.
Next, we'll tie it all together and see how a digital camera takes a picture.
CCD Camera Summary
It takes several steps for a digital camera to take a picture. Here's a review of
what happens in a CCD camera, from beginning to end:
 You aim the camera at the subject and adjust the optical zoom to get closer or
farther away.
 You press lightly on the shutter release.
 The camera automatically focuses on the subject and takes a reading of the
available light.
 The camera sets the aperture and shutter speed for optimal exposure.
 You press the shutter release all the way.
 The camera resets the CCD and exposes it to the light, building up an
electrical charge, until the shutter closes.
 The ADC measures the charge and creates a digital signal that represents the
values of the charge at each pixel.
 A processor interpolates the data from the different pixels to create natural
color. On many cameras, it is possible to see the output on the LCD at this
stage.
 A processor may perform a preset level of compression on the data.
 The information is stored in some form of memory device (probably a Flash
memory card).
A CCD Image Sensor:
The advent of CMOS technology in the eighties led to phenomenal growth in the
semiconductor industry. Transistors have become smaller, faster, cheaper to
manufacture, and consume less power. It is CMOS technology that has enabled very
high integration on chips, leading to modern high-performance, miniaturized
integrated circuits.
Apart from its valuable contribution to the miniaturization of integrated circuits,
CMOS technology has also found use in sensing applications.
CMOS technology has been adopted to design sensors, especially in the field of
imaging. CMOS-based image sensors are now so widely used that the term "CMOS
sensor" is often treated as their synonym, and they have emerged as a competitor
to CCD-based image sensors.
Until recently, charge-coupled devices (CCDs) dominated most image sensing
systems -- cameras, camcorders, astronomical cameras, scanners and so on. Of late,
however, CMOS imagers have emerged as an alternative to CCD imagers, and they
offer some better features as well.
Subsequent sections will discuss both CCD and CMOS sensor based imagers, their
pros and cons, and also their applications. Further, other applications
of CMOS technology in the field of sensing will be discussed.
CMOS Vs CCD
The invention of the CCD marked the end of the vacuum tube imagers used in
television cameras, as it overcame their disadvantages: chronic picture artifacts
such as lag and burn-in, the fragility of large glass tubes and their sensitivity
to shock, vibration and electromagnetic radiation, the painstaking periodic
alignment of tubes, and so on. It also marked the beginning of a new era in imaging
systems, and for decades CCDs enjoyed quality advantages over rival CMOS sensors.
Wherever image quality was paramount, CCDs were preferred; CMOS sensors were used
mainly in applications where small size and low power were the prime requirements.
With developments in CMOS technology, the gap between CCD and CMOS sensors has
narrowed; CMOS sensors can now achieve competitive quality, and the choice between
the two has become increasingly difficult.
Both CCD and CMOS image sensors use large arrays of thousands (sometimes millions)
of photo-sites, commonly called pixels. Both carry out the same steps.
1. Light-to-charge conversion
Incident light is directed by the microlens (a tiny lens placed over the pixel to increase
its effective size and thereby fill factor) onto the photo-sensitive area of each pixel
where it is converted into electrons that collect in a semiconductor "bucket."
The bigger the pixel, the more light it can collect; thus, big-pixel sensors work
best under low-light conditions. For the same number of pixels, bigger pixels
result in a bigger chip, which means higher cost. Conversely, smaller pixels enable
smaller chip sizes and lower chip prices, as well as lower lens costs. But there
are limits on pixel size reduction: smaller pixels are less sensitive to light,
and the optics required to resolve them become expensive and require costly
fabrication processes.
2. Charge accumulation
As more light enters, more electrons accumulate into the bucket.
3. Transfer
Accumulated charge must be transferred to the signal conditioning and processing
circuitry.
4. Charge-to-voltage conversion
The accumulated charge must be output as a voltage signal.
5. Amplification
The voltage signal is then amplified before it is fed to the camera circuitry.
Both CMOS and CCD sensors perform all these tasks; the aspect in which they differ
is the order in which these tasks are executed.
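The five steps can be strung together as a toy pipeline for a single pixel. All constants below are arbitrary illustrative values, not real device parameters:

```python
# Arbitrary illustrative constants -- not real device parameters.
QUANTUM_EFFICIENCY = 0.5   # fraction of photons converted to electrons
CHARGE_TO_VOLT = 0.001     # volts per electron at the sense node
GAIN = 10.0                # output amplifier gain

def expose_pixel(photons_per_ms, exposure_ms):
    # Steps 1-2: light-to-charge conversion and charge accumulation.
    electrons = photons_per_ms * exposure_ms * QUANTUM_EFFICIENCY
    # Steps 3-4: transfer the "bucket" and convert charge to voltage.
    voltage = electrons * CHARGE_TO_VOLT
    # Step 5: amplify before handing off to the camera circuitry.
    return voltage * GAIN

print(expose_pixel(photons_per_ms=100, exposure_ms=20))  # 10.0 (toy units)
```

Where the charge-to-voltage conversion and amplification happen, per pixel in CMOS versus at a single output node in a CCD, is exactly the ordering difference the text describes.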
BRIEF ON CCD TECHNOLOGY
CCDs were first invented in 1969, originally as a data-storage technology inspired
by bubble memory. In 1974, the first imaging CCD was produced by Fairchild
Electronics with a format of 100x100 pixels.
A CCD imager consists of two main parts: the color filter and the pixel array.
• Color filter
Micro-lenses funnel light onto the photo-sensitive part of each pixel. On their
way, the photons pass through a color filter array; the mosaic of these tiny
filters captures color information. The filters enable separate measurement of the
red (R), green (G) and blue (B) photons: each filter blocks unwanted wavelengths
and allows only a specific color of light to reach its pixel sensor. For this
purpose, each pixel is covered with a red, green or blue filter according to a
specific pattern, such as the Bayer CFA pattern.
The Bayer filter uses a 2x2 sub-mosaic pattern with one red, one blue and two
green filters. As the human eye has greater sensitivity to green light, two green
filters are used.
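The 2x2 tiling described above can be generated with a couple of modulo tests. A sketch, assuming the common RGGB variant of the Bayer pattern:

```python
# Assumes the common RGGB variant of the Bayer pattern.
def bayer_filter(row, col):
    """Colour of the filter over the photosite at (row, col)."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

mosaic = ["".join(bayer_filter(r, c) for c in range(4)) for r in range(4)]
print("\n".join(mosaic))   # RGRG / GBGB, repeating

# Half of all filters are green, matching the eye's greater sensitivity.
greens = sum(row.count("G") for row in mosaic)
print(greens, "of", 4 * 4)   # 8 of 16
```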
• Pixel Array
The pixel array functions on the principle of the photoelectric effect, and the
pixel sensors are responsible for capturing the intensity of the light passing
through. The light intensity data is combined before being converted into an
analog voltage signal, which is output to an external circuit board for further
processing.
After the incident light is converted into electrons, the electron charge
accumulates the same way a bucket stores water. The pixel charges are then read
using vertical and horizontal shift registers, which act as charge carriers.
CMOS SENSORS
A typical CMOS sensor is an integrated circuit with an array of pixel sensors. In
contrast to a CCD, each pixel in a CMOS sensor contains its own light sensor, an
amplifier and a pixel-select switch. An analog-to-digital converter and other
components critical to the operation of the pixel sensors are located on the CMOS
sensor itself.
The CMOS sensor contains four main parts: the color filters, the pixel array, the
digital controller, and the analog-to-digital converter.
• Color Filter
The color filter is the same as described for the CCD-based imager.
• Pixel Array
As in the case of the CCD, the function of the pixel array is to capture the
intensity of the light passing through. Each pixel sensor converts the intensity
of the incoming light to a voltage signal, which is then fed to the ADC for
further processing.
There are two types of architectures of Pixel sensors: Passive Pixel Sensor (PPS) &
Active Pixel Sensors (APS).
In passive pixel sensors, only one photo-detector (without any local amplifier)
per pixel is used, whereas in active pixel sensors, 3-4 transistors per pixel are
used. Passive pixel sensors have smaller pixels and a large fill factor, but they
are slow and have a low SNR. Active pixel sensors, on the other hand, are fast and
have good SNRs, but larger pixels and a low fill factor.
However, with the advancement of CMOS technology down to nanometer scales, pixel
size and fill factor are no longer major issues, and APS is the preferred
technology used in most devices.
• ADC
The ADC takes the analog voltage signals from the pixel sensor array and converts
them into a digital signal.
• Digital Controller
The digital controller governs the functioning of the CMOS sensor; it controls the
pixel array, ensures synchronism between all pixels, etc.
Operation of CMOS Sensors
a) The pixel sensor acts like a charge bucket; it accumulates electron charge the
same way a water bucket stores water.
b) The charge is converted to voltage and amplified at the pixel.
c) Individual CMOS microwires carry the voltage from one pixel at a time,
controlled by the pixel-select switch.
d) To output the video signal, the following steps are performed:
1. All pixel-select switches in a row are turned ON. This outputs the voltage of
each pixel in that row to its column circuit.
2. Column-select switches are turned ON from left to right. In this way, the
signal voltages of each pixel in the same row are output in order.
3. This is repeated for all rows from top to bottom, so the signal voltages of all
pixels are output in order, from the top-left corner to the bottom-right corner of
the image sensor.
e) These signal voltages are output to the signal processor of the camera.
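The row-and-column scan in steps 1 through 3 can be sketched as two nested loops. The voltages below are illustrative stand-ins for the per-pixel amplified signals:

```python
# A small stand-in for the per-pixel amplified signals on the array.
voltages = [
    [0.9, 0.4, 0.7],
    [0.2, 0.8, 0.1],
    [0.5, 0.3, 0.6],
]

def scan(sensor):
    """Select each row top to bottom; within a row, step the column
    switches left to right, so signals stream out from the top-left
    corner to the bottom-right corner."""
    stream = []
    for r, row in enumerate(sensor):          # row select, top to bottom
        for c, voltage in enumerate(row):     # column select, left to right
            stream.append(((r, c), voltage))  # repeated for every row
    return stream

readout = scan(voltages)
print(readout[0])    # ((0, 0), 0.9) -- top-left pixel comes out first
print(readout[-1])   # ((2, 2), 0.6) -- bottom-right pixel comes out last
```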
CMOS SENSOR TYPES
The difference between types of CMOS sensors generally comes down to the number of
transistors present at each pixel, which affects the fill factor. The fill factor
is the portion of the pixel sensor that is actually sensitive to light.
a) Rolling Shutter type
This has a limited number of transistors per pixel and therefore a high fill
factor. However, lines of pixels are exposed at different times, so movement in
the target produces a distorted image.
b) Global Shutter type
The number of transistors is higher in this case, resulting in a low fill factor.
But all the pixels are exposed at the same time, so the movement artifacts
associated with rolling-shutter sensors are eliminated.
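The timing difference between the two shutter types can be sketched as the exposure start time of each row (arbitrary time units, illustrative only):

```python
# Exposure start time of each row, in arbitrary time units
# (illustrative only, not real sensor timings).
def exposure_starts(rows, shutter, line_delay=1):
    if shutter == "rolling":
        # Rows begin exposing one after another, so a moving subject
        # lands in a different place on each line -- the distortion.
        return [r * line_delay for r in range(rows)]
    if shutter == "global":
        # Every row begins at once; motion artifacts disappear.
        return [0] * rows
    raise ValueError("unknown shutter type")

print(exposure_starts(4, "rolling"))  # [0, 1, 2, 3]
print(exposure_starts(4, "global"))   # [0, 0, 0, 0]
```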
CCD AND CMOS SENSORS: PROS AND CONS
1. Fabrication Process
CCD sensors require dedicated, costly manufacturing processes, whereas CMOS
sensors rely on standard CMOS technology (the same used for IC fabrication of
microprocessors, memory, etc.). Because CMOS sensors can also integrate the
required electronics on the same chip, they result in a compact and cost-effective
system.
2. Dynamic Range
The dynamic range of a CCD is roughly twice that of a CMOS sensor. This implies
that where better colour depth is required, CCDs are likely to offer better
results. On the other hand, CMOS sensors are marginally more photosensitive.
3. Power Consumption
CMOS sensors themselves consume less power than CCDs, although the surrounding
circuitry may require more. Low-end CMOS sensors have low power requirements, but
high-speed CMOS cameras typically require more power than CCDs.
4. Noise
Two types of noise affect sensor performance: temporal noise and fixed-pattern
noise. Fixed-pattern noise is higher in CMOS sensors than in CCDs because charge is
converted to voltage at each pixel, rather than at a single conversion point as in a
CCD. In terms of temporal noise, CMOS sensors are better because the bandwidth of
the per-pixel amplifiers is lower than that of the single output amplifier of a
CCD.
5. Image Quality
Due to the poor fill factor of CMOS sensors, their photosensitivity in low-light
conditions is poor.
6. Uniformity of response
CCDs use a single amplifier for all pixels, whereas CMOS sensors use a separate
amplifier for each pixel. Pixel-to-pixel amplification differences lead to
non-uniformity; the response of CCDs is quite uniform.
7. Speed
CMOS sensors are faster because the active pixels and ADCs are on the same chip,
leading to smaller propagation delays.
8. Readout area
CMOS sensors allow any region, or even multiple regions, to be read off the sensor.
CCDs are limited by their vertical-scan readout.
9. Smart functions
With signal processing circuitry integrated on the CMOS sensor chip, functions such
as auto gain control, auto exposure control, anti-jitter, image compression, color
encoding, and motion tracking can be incorporated on-chip.
10. Overexposure effect
Overexposure can cause smearing around over-exposed pixels, caused by charge
spilling into the shift register. Because CMOS sensors have no shift registers,
they are immune to this effect.
Figure: A CMOS image sensor
Film vs. Digital: A Comparison of the Advantages and
Disadvantages
In a world in which photographs are primarily taken with digital image sensors, there
are a growing number of photographers who are newly interested in film formats of
the past. But why would anyone in our age of technological convenience still choose
to shoot with analog film?
To understand the advantages and disadvantages of each shooting practice, we
compare the image quality of each format, along with the cost of usage.
If you have been thinking of tinkering with film photography, you have landed in the
right place.
Resolution
When it comes to both digital and analog formats, photographers want to
know that their efforts will result in sharp, high-resolution photographs. With digital
image sensors, we determine resolution by counting the number of pixels within a
given area. Film does not have pixels, so a film’s resolving power is instead
measured through angular resolution. Both methods of measurement can be
correlated with each other and thus compared for equivalent resolution.
Just as different sensors produce different resolutions, different types of film
will also produce different resolutions. Roger N. Clark’s analysis of standard 35mm
film showed that, depending on the type of film used, the resolution fell between 4
and 16 million pixels. For example, Clark’s study noted that Fujifilm’s Provia 100
film produced a resolution of around 7 MP, while Fujifilm’s Velvia 50 produced a
resolution of around 16 MP. Considering that entry-level cameras such as Nikon’s
D3300 produce around 24 MP, 35mm film doesn’t have much of an advantage in this
scenario.
That being said, many professional photographers who shoot film opt to do so with
medium or large formats. According to research carried out by a team of four industry
experts, medium format film can potentially capture a jaw-dropping 400 MP
photograph; after digital scanning, however, the result was a resolution
of 50 to 80 MP. Another test, also conducted by Roger N. Clark, noted that larger
formats such as 4×5 inches can capture 200 MP equivalent photographs after being
scanned.
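The correspondence between a film's resolving power and a pixel count can be approximated with a simple calculation (the 50 line-pairs-per-mm figure below is an illustrative assumption, not a measured value for any particular film):

```python
# Rough conversion from film resolving power to an equivalent pixel count.
# By the Nyquist criterion, one line pair needs at least two pixels, so film
# resolving R line pairs per mm samples like 2*R pixels per mm.

def film_megapixels(width_mm, height_mm, lp_per_mm):
    px_wide = width_mm * lp_per_mm * 2   # pixels across the frame
    px_high = height_mm * lp_per_mm * 2  # pixels down the frame
    return px_wide * px_high / 1e6

# A 36 x 24 mm (35mm) frame at an assumed 50 lp/mm
print(film_megapixels(36, 24, 50))  # 8.64 (megapixels)
```

On this crude model, the 4 to 16 MP range quoted above corresponds to 35mm film resolving roughly 35 to 70 lp/mm, depending on the film stock.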
In short, that 35mm film camera you picked up from the flea market may
not be able to outperform the latest digital cameras, but a medium format or large
format unit can match or exceed the resolution of Phase One’s latest $40,000
camera system.
Digital Noise / Film Grain :
The random appearance of small textures within a photograph may be referred
to as digital noise or film grain. With analog film, grain is the result of small chemical
particles that have not received enough light. Within digital image sensors, noise is
the result of unwanted signals created by the camera’s digital circuitry; this can be due
to excess heat or the sensor’s limited ability to reject stray electrical signals.
Increasing the ISO of a digital camera or selecting high-speed film will make
your photographs more susceptible to noise and grain. In most situations, noise is
unwanted in color photos; however, with black and white images, some artists view
the grain as adding character, and thus not as a negative point.
Testing by magnetic recording technology expert Norman Koren showed that digital
photography has evolved to the point at which it has far less noise than film of
the equivalent speed. Of course, digital noise depends on the sensor within a
digital camera, so older units may not be as efficient.
One last item to consider with noise/grain is that film may be a better medium
for capturing long exposure photographs. Image sensors must be kept at low
temperatures to avoid thermal noise, which becomes difficult with
prolonged use of the imaging circuitry. Film, on the other hand, does not have any
issues with overheating.
Dynamic Range
Once the overriding reason to shoot analog film over digital, dynamic
range is no longer the huge debate it once was. While measuring the dynamic range of
an image is a complex process that takes into account the sensor used, the type of file
compression, and other factors, digital is ultimately winning against analog film.
A release by Kodak showed that most film has around 13 stops of dynamic
range. Today’s digital cameras average around 14 stops of dynamic range,
with high-end units such as the Nikon D810 reaching almost 15 stops. Film
continues to deliver incredible dynamic range, but today’s digital technology can
easily match it.
Independent testing of dynamic range on film cameras, such as the tests conducted by
Roger N. Clark, showed that high-end digital cameras in 2005 began to show “huge
dynamic range compared to [scans of] either print or slide film”. Films used in the
testing included Kodak Gold 200 and Fujifilm FujiChrome Velvia.
In addition, many digital cameras take advantage of sequential shots and
HDR capabilities to create photographs with exceptionally high dynamic range,
beyond what is possible with film.
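Since a "stop" is a factor of two in light, the dynamic-range figures quoted above translate directly into contrast ratios:

```python
# Each stop of dynamic range doubles the ratio between the brightest and
# darkest tones a medium can record, so the ratio is simply 2**stops.

def contrast_ratio(stops):
    return 2 ** stops

print(contrast_ratio(13))  # 8192  -> film, per the Kodak figure
print(contrast_ratio(14))  # 16384 -> typical modern digital camera
print(contrast_ratio(15))  # 32768 -> high-end bodies such as the D810
```

So the one or two extra stops of modern digital sensors amount to a two- to four-fold wider tonal range than typical film.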
Film Speed
When it comes to shooting in low light conditions, digital image sensors easily
take the cake. Film is usually available in speeds between ISO 100 and 3200,
although 6400 film does exist. Today’s digital camera systems can match the noise
produced by analog cameras in these ranges, as well as push their sensitivity many
stops higher. Consumer digital cameras such as Fujifilm’s X100T can simulate
sensitivities as high as ISO 51200, while professional Nikon systems, such as the
D4s, can shoot as high as ISO 409,600.
Digital cameras also have the advantage of being able to change film speeds
between individual photographs. For most common roll films used today (135, 120,
etc.), the ISO is kept constant across the entire roll. The exception is with
large format cameras that use one sheet at a time, and thus can be switched
between shots.
Analog film can be pushed or pulled multiple stops when needed, but the
amount of contrast within the image is affected. Some photographers use this to their
advantage to create the ideal look they desire, but this method still does not allow
extremely high ISO speeds without impacting image tones.
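Push and pull processing works in whole stops, each stop doubling or halving the effective film speed. A minimal sketch:

```python
# Effective ISO after pushing (positive stops) or pulling (negative stops)
# film in development; each stop is a factor of two in sensitivity.

def effective_iso(base_iso, stops):
    return int(base_iso * 2 ** stops)

print(effective_iso(400, 2))   # ISO 400 pushed two stops -> 1600
print(effective_iso(3200, 1))  # ISO 3200 pushed one stop -> 6400
print(effective_iso(400, -1))  # ISO 400 pulled one stop  -> 200
```

Even a generous two-stop push keeps film far below the six-figure ISO settings available on modern digital bodies, which is why the contrast penalty noted above matters.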
Cost and Convenience
When it comes to cost and convenience, both digital and analog formats have
their advantages and disadvantages. Considering the number of photographs you take
within a given time, how urgently you need an image available, and the type of
subjects you shoot will help you choose between the two options.
Digital has a much more expensive up-front cost, and evolving technology
means you will most likely want to upgrade your equipment within a few years. For
those who demand instant access to their photographs, there is nothing faster and
more convenient than digital. When shooting high-speed action photography, there is
also no concern about running out of film; large memory cards can easily store
hundreds or thousands of high-resolution photographs.
Analog is much more affordable up front, and you will most likely be able to
use your film body for decades to come, as genuine enhancements are made to the
film itself rather than the camera. That being said, analog shooters will spend a
lot more money on film rolls and development costs. Film also needs to be conserved
more carefully, since nothing can simply be deleted as with digital, and photos are
not available instantly. Most available processing labs take at least 24 hours, if
not a few days, to complete the process. Sadly, one-hour photography stores are a
dying breed.
Let’s say that you want a modern digital camera with resolution, dynamic
range, and grain equivalent to ISO 100 film. You may choose to pick up a Nikon
D3300 – an entry camera that checks off all the boxes. The initial purchase may cost
$500, but with a cheap memory card ($30) you can shoot unlimited photographs and
delete what you don’t need. You may then opt to upgrade your camera within a
five-year span for another $500.
If you were to pick up a decent film camera for $150 and then shoot 100 photographs
a month for a year, your total film costs would be around $260 (using Kodak Ektar
100 Pro) and your development costs would be around $370. Over a five-year span,
you may not want to upgrade your camera, but total development and film costs
would still amount to around $3,200.
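The five-year figure can be sanity-checked with some assumed prices (the per-roll costs below are illustrative guesses chosen to be consistent with the yearly totals quoted in the text, not current market prices):

```python
# Back-of-envelope five-year film cost, assuming 36-exposure rolls.

shots_per_month = 100
months = 12 * 5                          # five-year span
rolls = shots_per_month * months / 36    # ~167 rolls over five years

film_per_roll = 7.80   # assumed price of one roll of Kodak Ektar 100
dev_per_roll = 11.00   # assumed lab development cost per roll

total = rolls * (film_per_roll + dev_per_roll)
print(round(total))  # roughly $3,100, in line with the ~$3,200 quoted above
```

Changing the assumed prices shifts the total, but at 1,200 frames per year the recurring costs dominate the $150 camera body either way.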
Conclusion:
The digital revolution has caught up to film in many regards, killing many of
the arguments for film being better than its technological counterpart. However, the
most notable remaining reason to shoot analog may be the resolution obtainable from
medium format cameras. Not every consideration can be captured in technical
comparisons, though. Many will argue that shooting analog is a more personal and
enjoyable experience; that decision is completely up to you.
References:
1. Tom Harris, How Cameras Work, http://electronics.howstuffworks.com/camera.htm
2. Reddit Photo Class, http://www.r-photoclass.com/
3. Image Acquisition, Springer
4. http://electronics.howstuffworks.com/cameras-photography/digital/digital-camera.htm
5. Image Sensors, http://www.engineersgarage.com/articles/what-is-cmos-sensor
6. CCD vs CMOS, http://electronics.howstuffworks.com/cameras-photography/digital/question362.htm/printable
7. Film vs Digital, http://petapixel.com/2015/05/26/film-vs-digital-a-comparison-of-the-advantages-and-disadvantages/
8. E. R. Fossum, “CMOS Image Sensors: Electronic Camera on a Chip,” IEDM, pp. 1.3.1–1.3.9, Dec. 1995.
9. B. Ackland and A. Dickinson, “Camera on a Chip,” in ISSCC Dig., Feb. 1996, pp. 22–25.
10. M. A. Schuster and G. Strull, “A monolithic mosaic of photon sensors for solid state imaging applications,” in International Electron Devices Meeting, 1965, pp. 20–21.
11. R. Melen, “The tradeoffs in monolithic image sensors: MOS vs CCD,” Electronics, vol. 46, pp. 106–11, 1973.

More Related Content

What's hot

Chapter10 image segmentation
Chapter10 image segmentationChapter10 image segmentation
Chapter10 image segmentationasodariyabhavesh
 
Image Acquisition and Representation
Image Acquisition and RepresentationImage Acquisition and Representation
Image Acquisition and RepresentationAmnaakhaan
 
Spatial Filters (Digital Image Processing)
Spatial Filters (Digital Image Processing)Spatial Filters (Digital Image Processing)
Spatial Filters (Digital Image Processing)Kalyan Acharjya
 
filters for noise in image processing
filters for noise in image processingfilters for noise in image processing
filters for noise in image processingSardar Alam
 
Introduction to image contrast and enhancement method
Introduction to image contrast and enhancement methodIntroduction to image contrast and enhancement method
Introduction to image contrast and enhancement methodAbhishekvb
 
Digital Image Processing (DIP)
Digital Image Processing (DIP)Digital Image Processing (DIP)
Digital Image Processing (DIP)Srikanth VNV
 
Digital image processing
Digital image processingDigital image processing
Digital image processingmanpreetgrewal
 
Image Filtering in the Frequency Domain
Image Filtering in the Frequency DomainImage Filtering in the Frequency Domain
Image Filtering in the Frequency DomainAmnaakhaan
 
Digital Image Processing: Digital Image Fundamentals
Digital Image Processing: Digital Image FundamentalsDigital Image Processing: Digital Image Fundamentals
Digital Image Processing: Digital Image FundamentalsMostafa G. M. Mostafa
 
Introduction to Image Compression
Introduction to Image CompressionIntroduction to Image Compression
Introduction to Image CompressionKalyan Acharjya
 
SPATIAL FILTERING IN IMAGE PROCESSING
SPATIAL FILTERING IN IMAGE PROCESSINGSPATIAL FILTERING IN IMAGE PROCESSING
SPATIAL FILTERING IN IMAGE PROCESSINGmuthu181188
 
Image enhancement
Image enhancementImage enhancement
Image enhancementAyaelshiwi
 
Digital image processing
Digital image processingDigital image processing
Digital image processingChetan Hulsure
 
introduction to Digital Image Processing
introduction to Digital Image Processingintroduction to Digital Image Processing
introduction to Digital Image Processingnikesh gadare
 
Image pre processing
Image pre processingImage pre processing
Image pre processingAshish Kumar
 
Image enhancement techniques
Image enhancement techniquesImage enhancement techniques
Image enhancement techniquesSaideep
 

What's hot (20)

Chapter10 image segmentation
Chapter10 image segmentationChapter10 image segmentation
Chapter10 image segmentation
 
Digital Image Processing
Digital Image ProcessingDigital Image Processing
Digital Image Processing
 
Image Acquisition and Representation
Image Acquisition and RepresentationImage Acquisition and Representation
Image Acquisition and Representation
 
Spatial Filters (Digital Image Processing)
Spatial Filters (Digital Image Processing)Spatial Filters (Digital Image Processing)
Spatial Filters (Digital Image Processing)
 
filters for noise in image processing
filters for noise in image processingfilters for noise in image processing
filters for noise in image processing
 
Introduction to image contrast and enhancement method
Introduction to image contrast and enhancement methodIntroduction to image contrast and enhancement method
Introduction to image contrast and enhancement method
 
Digital Image Processing (DIP)
Digital Image Processing (DIP)Digital Image Processing (DIP)
Digital Image Processing (DIP)
 
Digital image processing
Digital image processingDigital image processing
Digital image processing
 
Image Filtering in the Frequency Domain
Image Filtering in the Frequency DomainImage Filtering in the Frequency Domain
Image Filtering in the Frequency Domain
 
Digital Image Processing: Digital Image Fundamentals
Digital Image Processing: Digital Image FundamentalsDigital Image Processing: Digital Image Fundamentals
Digital Image Processing: Digital Image Fundamentals
 
Histogram processing
Histogram processingHistogram processing
Histogram processing
 
Image processing ppt
Image processing pptImage processing ppt
Image processing ppt
 
Introduction to Image Compression
Introduction to Image CompressionIntroduction to Image Compression
Introduction to Image Compression
 
SPATIAL FILTERING IN IMAGE PROCESSING
SPATIAL FILTERING IN IMAGE PROCESSINGSPATIAL FILTERING IN IMAGE PROCESSING
SPATIAL FILTERING IN IMAGE PROCESSING
 
Image enhancement
Image enhancementImage enhancement
Image enhancement
 
Digital image processing
Digital image processingDigital image processing
Digital image processing
 
Sharpening spatial filters
Sharpening spatial filtersSharpening spatial filters
Sharpening spatial filters
 
introduction to Digital Image Processing
introduction to Digital Image Processingintroduction to Digital Image Processing
introduction to Digital Image Processing
 
Image pre processing
Image pre processingImage pre processing
Image pre processing
 
Image enhancement techniques
Image enhancement techniquesImage enhancement techniques
Image enhancement techniques
 

Similar to Image Sensing and Aquisition

camera,types,working and functionality
camera,types,working and functionalitycamera,types,working and functionality
camera,types,working and functionalityRahat Malik
 
Lesson in art part 2 Q1 and Q2 by S. will
Lesson in art part 2 Q1 and Q2 by S. willLesson in art part 2 Q1 and Q2 by S. will
Lesson in art part 2 Q1 and Q2 by S. willYamwill
 
3 4 cameras
3 4 cameras3 4 cameras
3 4 camerasRbk Asr
 
8 k extremely high resolution camera system
8 k extremely high resolution camera system8 k extremely high resolution camera system
8 k extremely high resolution camera systemPrejith Pavanan
 
My Extended Project Qualification (EPQ)
My Extended Project Qualification (EPQ)My Extended Project Qualification (EPQ)
My Extended Project Qualification (EPQ)derhamo
 
Kodak filmmakers (Cine Calidad)
Kodak filmmakers (Cine Calidad)Kodak filmmakers (Cine Calidad)
Kodak filmmakers (Cine Calidad)Cine Calidad
 
camera
cameracamera
cameraarpch
 
What i wish everyone knew about
What i wish everyone knew aboutWhat i wish everyone knew about
What i wish everyone knew aboutRashed9410
 
History of Film Technology GCSE Film Studies.
History of Film Technology GCSE Film Studies. History of Film Technology GCSE Film Studies.
History of Film Technology GCSE Film Studies. Ian Moreno-Melgar
 
Ana maria arevalo 6 a tecnologia
Ana maria arevalo 6 a tecnologiaAna maria arevalo 6 a tecnologia
Ana maria arevalo 6 a tecnologiaanimaria98
 

Similar to Image Sensing and Aquisition (20)

Камера
КамераКамера
Камера
 
camera,types,working and functionality
camera,types,working and functionalitycamera,types,working and functionality
camera,types,working and functionality
 
Lesson in art part 2 Q1 and Q2 by S. will
Lesson in art part 2 Q1 and Q2 by S. willLesson in art part 2 Q1 and Q2 by S. will
Lesson in art part 2 Q1 and Q2 by S. will
 
3 4 cameras
3 4 cameras3 4 cameras
3 4 cameras
 
Presentation on camera
Presentation on cameraPresentation on camera
Presentation on camera
 
8 k extremely high resolution camera system
8 k extremely high resolution camera system8 k extremely high resolution camera system
8 k extremely high resolution camera system
 
Physics the camera
Physics the cameraPhysics the camera
Physics the camera
 
My Extended Project Qualification (EPQ)
My Extended Project Qualification (EPQ)My Extended Project Qualification (EPQ)
My Extended Project Qualification (EPQ)
 
Cameras 3rd Pt
Cameras 3rd PtCameras 3rd Pt
Cameras 3rd Pt
 
Kodak filmmakers (Cine Calidad)
Kodak filmmakers (Cine Calidad)Kodak filmmakers (Cine Calidad)
Kodak filmmakers (Cine Calidad)
 
camera
cameracamera
camera
 
Technology Timeline
Technology TimelineTechnology Timeline
Technology Timeline
 
What i wish everyone knew about
What i wish everyone knew aboutWhat i wish everyone knew about
What i wish everyone knew about
 
Introduction to-photography
Introduction to-photographyIntroduction to-photography
Introduction to-photography
 
History camera2
History camera2History camera2
History camera2
 
History of Film Technology GCSE Film Studies.
History of Film Technology GCSE Film Studies. History of Film Technology GCSE Film Studies.
History of Film Technology GCSE Film Studies.
 
Photography
PhotographyPhotography
Photography
 
02 Fall09 Lecture Sept18web
02 Fall09 Lecture Sept18web02 Fall09 Lecture Sept18web
02 Fall09 Lecture Sept18web
 
Clinical phototgraphy
Clinical phototgraphyClinical phototgraphy
Clinical phototgraphy
 
Ana maria arevalo 6 a tecnologia
Ana maria arevalo 6 a tecnologiaAna maria arevalo 6 a tecnologia
Ana maria arevalo 6 a tecnologia
 

Recently uploaded

Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024The Digital Insurer
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxKatpro Technologies
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024Rafal Los
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking MenDelhi Call girls
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CVKhem
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024The Digital Insurer
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfEnterprise Knowledge
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?Igalia
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Servicegiselly40
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 

Recently uploaded (20)

Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CV
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Service
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 

Image Sensing and Aquisition

  • 1. i IMAGE SENSING AND AQUISITION SEMINAR REPORT SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF BACHELOR OF TECHNOLOGY IN ELECTRONICS AND COMMUNICATION ENGINEERING BY O.V.S SHASHANK RAM (12SS1A0446) Department of Electronics and Communication Engineering Jawaharlal Nehru Technological University Hyderabad College of Engineering Sultanpur, pulkal (M), Medak-502293 Telangana 2016
  • 2. ii Jawaharlal Nehru Technological University Hyderabad College of Engineering Sultanpur, pulkal (M),Medak-502293 Telangana Department of Electronics and Communication Engineering CERTIFICATE Date: This is to certify that the seminar work entitled “IMAGE SENSING AND AQUISITION” is a bonafide work carried out by “O.V.S SHASHANK RAM” bearing Roll no.12SS1A0446 in partial fulfillment of the requirements of the degree of BACHELOR OF TECHNOLOGY in ELECTRONICS & COMMUNICATION ENGINEERING by the Jawaharlal Nehru Technological University, Hyderabad during the academic year 2015-16. The results embodied in this report have not been submitted to any other University or Institution for the award of any degree or diploma. ------------------------- --------------------- Mr. V. Rajanesh Mr. B. Prabhakar Associate Professor Associate Professor Guide Head of the Department
  • 3. iii Abstract The inclusion of cameras in everything from cell phones to pens to children’s toys is possible because of the low cost and low power consumption of the imaging arrays that form the core of the cameras. However, these arrays are low cost and low power because they are CMOS-based; this allows the devices to be made with the same processes and facilities that are used to make memory and computer chips. Yet, the continued surge in CMOS imager popularity goes beyond the lower cost to other factors, such as the ability to integrate the sensors with electronics and the ability to achieve fast, customizable frame rates. People have been using cameras and film for more than 100 years, both for still photography and movies. There is something magical about the process -- humans are visual creatures, and a picture really does paint a thousand words for us! In this report we will discuss how an image is sensed and then acquired, what a camera is, and how it works. Various image sensors are discussed, along with the differences between these sensors; the differences between analog and digital image sensing are also discussed in detail.
  • 4. iv Contents: 1. Introduction 1 2. Camera 2 3. History of Camera 3 4. Components of a Camera 6 5. Working of a Camera 13 6. Image Sensors 20 7. CCD vs CMOS 28 8. Analog (Film) vs Digital 38 9. Conclusion 46 References 47
  • 5. 1 Introduction: Before any video or image processing can commence, an image must be captured by a camera and converted into a manageable entity. This whole process is termed image sensing and acquisition. Image sensing and acquisition mainly deals with how the image is sensed and then acquired as desired. This is the first and foremost of the elements of digital image processing. Only after this important step can we implement any processing on the image. The electronic device most commonly associated with image sensing and acquisition is called a camera. As a photographer, I have always been interested in the camera as a medium. I read Darwin’s writings on the evolution of the eye in his “On the Origin of Species” and was struck (as many have been) by its remarkable similarity to the development of photography. Nature did not create the eye fully formed -- Darwin demonstrated how this occurred over countless generations in a slow, methodical process. It started with a flat disk of light-sensitive cells that could detect the presence of light but nothing more. This disk began to dimple, and the dimple grew deeper to form a cup. As the cup closed over, the opening formed an aperture with enough power to resolve a dim, fuzzy image onto the back of the proto-eye (a design still used today by the nautilus). This cup filled with mucus and an encapsulated lens formed over the opening. We now had the camera-type eye, a very sharp and bright camera for seeing the world. The analogy to human-made cameras is not perfect. We did not progress slowly by dimpling light-sensitive paper until it eventually became a Nikon. But cameras have evolved on their own trajectory. Humans were playing with pinhole camera obscuras, as written about in ancient Chinese and Greek texts, and we had simple-lens camera lucidas by the early Renaissance. So what was the first human-made camera? What was equivalent to the flat disk of light-sensitive cells? 
Many theorists have written about the analogy of Plato’s cave, in which people trace the shadows cast on the back wall of a cave. The first human-made proto-cameras could have been present with the first proto-humans, who watched their shadows dance across the cave walls, cast from their fire pits. Photography is undoubtedly one of the most important inventions in history -- it has truly transformed how people conceive of the world. Now we can "see" all sorts of things that are actually many miles -- and years -- away from us. Photography lets us capture moments in time and preserve them for years to come. First let us discuss how a camera operates, the history of camera technology, and how the name originated. Then we will discuss what an image sensor is, the different types of image sensors and their basic operation, and finally how a color image is formed.
  • 6. 2 Camera: With a rather gentle introduction, we ask ourselves what a camera really is, and what its different components are. Chances are that you will already know some of this, but going through it anyway will at least ensure that we have defined a common vocabulary. In the strictest sense, it is simply a device which can record light. It does so by focusing light on a photosensitive surface. From this simple sentence, we can see the three main parts of any camera. The basic technology that makes all of this possible is fairly simple. A still film camera is made of three basic elements: an optical element (the lens), a chemical element (the film) and a mechanical element (the camera body itself). As we'll see, the only trick to photography is calibrating and combining these elements in such a way that they record a crisp, recognizable image.
  • 7. 3 History of Camera: 1500 – Camera Obscura The first pinhole camera (also called the camera obscura) was invented by Alhazen (Ibn al-Haytham). Camera obscura (Latin) means a vaulted (closed) room. Basically, a pinhole camera must be built so that light enters only through the hole. 1839 – Daguerreotype Camera The Daguerreotype camera was announced by the French Academy of Sciences. One of these cameras is now among the world’s most expensive. 1840 – First Patent The first American patent in photography was issued to Alexander Wolcott for his camera. 1859 – Panoramic Camera The panoramic camera was patented by Thomas Sutton. 1861 – Stereoscope Viewer Oliver Wendell Holmes invents the stereoscope viewer. 1888 George Eastman patents the Kodak roll-film
  • 8. 4 camera. Eastman was a pioneer in the use of photographic film. He also started manufacturing paper film in 1885. His first Kodak box camera was very simple and very cheap. 1900 The first mass-marketed camera, the Brownie, was presented by Eastman. It was on sale until the 1960s. 1900 The Raisecamera (travel camera) was invented. Its extremely light weight and small folded dimensions made this camera highly desirable for landscape photographers.
  • 9. 5 1913/1914 The first 35mm still camera (also called a “candid” camera) was developed by Oskar Barnack of the German company Leica. Later it became the standard for all film cameras. 1948 Edwin Land invented the Polaroid camera, which could take a picture and print it in about one minute. 1960 EG&G develops an extreme-depth underwater camera for the U.S. Navy. 1978 Konica introduces the first point-and-shoot autofocus camera, the Konica C35 AF. It was named “Jasupin”. 1981 Sony demonstrates the Sony Mavica – the world’s first digital electronic still camera. 1986 Fuji introduced the disposable camera. The inventors also call this device a “single-use camera”. 1991 Kodak released the first professional digital camera system (DCS), which was of great use to photojournalists. It was a modified Nikon F-3 camera with a 1.3-megapixel sensor.
  • 10. 6 1994-1996 The first digital cameras for the consumer-level market that worked with a home computer via a serial cable were the Apple QuickTake 100 camera (February 17, 1994), the Kodak DC40 camera (March 28, 1995), the Casio QV-11 (with LCD monitor, late 1995), and Sony’s Cyber-Shot Digital Still Camera (1996). 2000 In Japan, Sharp’s J-SH04 introduced the world’s first camera phone. 2005 The Canon EOS 5D is launched. This is the first consumer-priced full-frame digital SLR, with a 24x36mm CMOS sensor.
  • 11. 7 Components of Camera: A camera consists of: 1. Lens 2. Aperture 3. Shutter 4. Photo-sensing element 5. Buffer 6. ISP (image signal processor)
  • 12. 8 Debriefed: The photosensitive surface reacts to light through either a chemical process (film) or an electric one (digital sensor). There are fundamental differences between these two, which we will cover subsequently, but for now we can consider both of them to be identical: they are a grid of several million tiny dots (pixels), and each can remember how much light it received in a given period of time. There are three important qualities to each sensor: resolution, size and what we can call “quality”.  Resolution is simply the number of pixels (it is slightly more complicated with film; let’s forget about that for now). The more pixels you have, the more fine-grained details you can theoretically record. Any resolution above 2 or 3 megapixels (i.e. millions of pixels) will be enough for displaying on a screen, but higher resolutions come into play for two important applications: printing and cropping. o In order to have good reproduction quality, it is generally estimated that between 240 and 300 pixels should be used for every inch of paper (dots per inch, or dpi), which gives a natural limit to the biggest size one can print. For instance, a 6MP image of dimensions 2000×3000 pixels can be printed at a maximum size of 12.5×8.3″ at 240dpi (2000/240 = 8.3, 3000/240 = 12.5). It is possible to print bigger by either lowering the dpi or artificially increasing the resolution, but this will come at a serious loss of image quality. Having a higher resolution allows you to print bigger. o Cropping means reducing the size of an image by discarding pixels on the sides. It is a very useful tool and can often improve composition or remove unwanted elements from an image. However, it will also decrease resolution (since you lose pixels), so how much cropping you allow yourself will depend on the initial resolution, which you want to be as high as possible. 
This is also what some cheaper cameras call “digital zoom”, whose use should be avoided like the plague, as the same effect can very easily be reproduced in post-processing, and the loss of image quality is often enormous.  The physical size of the sensor is very important and will have an impact on many other parameters, most of which we will see in subsequent lessons: crop factor, depth of field, high-ISO noise and dynamic range are some of them. Bigger sensors also allow more widely spaced pixels (increasing image quality) or more of them (increasing resolution). Bigger is almost always better, and this is one of the main reasons that DSLRs (and medium format cameras) produce much better images than compact cameras. In tomorrow’s lesson, we will cover the different types of cameras in more detail.  Finally, sensor quality is harder to quantify, but it refers to how well the sensor reacts to difficult light conditions: either low light, which will require increasing the ISO and for which we want the sensor to have as little noise as possible, or high contrast, which will require a good dynamic range to be recorded adequately.
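The print-size arithmetic above can be sketched in a few lines of Python (the 240 dpi figure and the 6 MP example come from the text; the function name is just for illustration):

```python
def max_print_size(width_px, height_px, dpi=240):
    """Largest print, in inches, that a given pixel resolution supports."""
    return width_px / dpi, height_px / dpi

# The 6 MP (3000x2000) example from the text, printed at 240 dpi:
w_in, h_in = max_print_size(3000, 2000)
print(f"{w_in:.1f} x {h_in:.1f} inches")  # 12.5 x 8.3 inches
```

Cropping simply reduces `width_px` and `height_px`, which is why it shrinks the largest print you can make.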
  • 13. 9 The lens is the second component of any camera. It is an optical device which takes scattered light rays and focuses them neatly on the sensor. Lenses are often complex, with up to 15 different optical elements serving different roles. The quality of the glass and the precision of the lens will be extremely important in determining how good the final image is. Lenses must make compromises, and a perfect all-around lens is physically impossible to build. For this reason, good lenses tend to be specialized, and having the ability to switch them on your camera will prove extremely useful. Lenses usually come with cryptic sequences of symbols and numbers which describe their specifications. Without going too much into details, let’s review some of their characteristics:  Focal length refers roughly to the “zoom level”, or angle of view, of the lens. It will have its own lesson in a few days, as it can be a surprisingly tricky subject. A focal length is usually expressed in millimeters, and you should be aware that the resulting angle of view actually depends on the size of the sensor of the camera on which the lens is used (this is called the crop factor). For this reason, we often give “35mm equivalent” focal lengths, which is the focal length that would offer the same view on a 35mm camera (the historic film SLR format) and allows us to make meaningful comparisons. If there is a single length (e.g. 24mm), then the lens doesn’t zoom, and it is often called a prime lens. If there are two numbers (e.g. 18-55mm), then you can use the lens at any focal length in that range. Compact cameras often don’t give focal lengths but simply the range, for instance 8x. This means that the long end is 8 times longer than the wide one, so the lens could for instance be an 18-144mm, or a 35-280mm, etc.  The aperture is a very important concept which we will talk about in much detail later on. 
The aperture is an iris in the centre of the lens which can close to increasingly small sizes, limiting the amount of light which gets to the sensor. It is referred to as an f-number, for instance f/2.8. To make things worse, it is quite counter-intuitive, as the smaller the number, the bigger the aperture! For now, we don’t have to worry about this too much. The important number on a lens is the maximal aperture; the lower, the better. Professional zoom
  • 14. 10 lenses often have f/2.8 maximal apertures, and cheaper consumer lenses have ranges such as f/3.5-5.6, meaning that at the wide end, the maximum aperture is f/3.5 and at the long end, it is f/5.6. The aperture can be closed to tiny sizes, usually at least f/22.  Lenses also need a focusing system. Nowadays, most lenses have an internal motor which can be piloted by the camera: the autofocus. They also have a ring to allow the photographer to focus manually. There are plenty of options for autofocus motors as well, for instance hypersonic or silent ones.  Lenses are increasingly equipped with stabilisation systems (called VR by Nikon, IS by Canon). They detect small movements, usually handshake, and compensate for them by moving the optical elements internally in the opposite direction. Though not magic pills, those systems tend to work very well and allow taking sharp images at quite slow shutter speeds.  Finally, lenses can have all sorts of fancy options: apochromatic glass, nano-coating, etc., designed to increase the quality of the final image. You probably shouldn’t worry too much about those. Finally, the body is the light-tight box connecting the lens to the sensor, and ordering everyone around. Though some film cameras are just that, black boxes, most digital cameras are now small computers, sporting all sorts of features, often of dubious usefulness. Let’s review some of the components found in most bodies:  The most important is probably the shutter. Think of it as a curtain in front of the sensor. When you press the trigger, the curtain opens, exposes the sensor to light from the lens, then closes again after a very precise amount of time, often a tiny fraction of a second. Most shutters operate between 30 seconds and 1/4000 of a second. That duration (the shutter speed) is one of the three very important exposure factors, along with aperture and ISO.  A light meter. 
As the name suggests, it measures the quantity of light and sets the exposure accordingly. How much manual control you keep at this stage is one of the most important questions in photography. There are different metering modes, but except in very specific cases, using the most advanced, most automated one (matrix metering on Nikon cameras) will provide the best results.  A focus detector, used to drive the autofocus motor in the lens. There are two competing technologies, contrast detection and phase detection, with at the
  • 15. 11 moment an edge for the latter, which explains why DSLRs tend to focus faster than compact cameras.  A way to store the image just created. Back in the days of film, this was just a lever to advance the roll to the next unexposed frame. Now, it is a pipeline which ends up in the memory card that the camera is using.  A way to frame. It can be a multitude of things, optical or electronic viewfinder, LCD screen or even ground glass. The optical component of the camera is the lens. At its simplest, a lens is just a curved piece of glass or plastic. Its job is to take the beams of light bouncing off of an object and redirect them so they come together to form a real image -- an image that looks just like the scene in front of the lens. But how can a piece of glass do this? The process is actually very simple. As light travels from one medium to another, it changes speed. Light travels more quickly through air than it does through glass, so a lens slows it down. When light waves enter a piece of glass at an angle, one part of the wave will reach the glass before another and so will start slowing down first. This is something like pushing a shopping cart from pavement to grass, at an angle. The right wheel hits the grass first and so slows down while the left wheel is still on the pavement. Because the left wheel is briefly moving more quickly than the right wheel, the shopping cart turns to the right as it moves onto the grass. The effect on light is the same -- as it enters the glass at an angle, it bends in one
  • 16. 12 direction. It bends again when it exits the glass because parts of the light wave enter the air and speed up before other parts of the wave. In a standard converging, or convex lens, one or both sides of the glass curve out. This means rays of light passing through will bend toward the center of the lens on entry. In a double convex lens, such as a magnifying glass, the light will bend when it exits as well as when it enters. This effectively reverses the path of light from an object. A light source -- say a candle -- emits light in all directions. The rays of light all start at the same point -- the candle's flame -- and then are constantly diverging. A converging lens takes those rays and redirects them so they are all converging back to one point. At the point where the rays converge, you get a real image of the candle. In the next couple of sections, we'll look at some of the variables that determine how this real image is formed. Cameras: Focus We've seen that a real image is formed by light moving through a convex lens. The nature of this real image varies depending on how the light travels through the lens. This light path depends on two major factors:  The angle of the light beam's entry into the lens  The structure of the lens. The angle of light entry changes when you move the object closer or farther away from the lens. You can see this in the diagram below. The light beams from the pencil point enter the lens at a sharper angle when the pencil is closer to the lens and a more obtuse angle when the pencil is farther away. But overall, the lens only bends the light beam to a certain total degree, no matter how it enters. Consequently, light beams that enter at a sharper angle will exit at a more obtuse angle, and vice versa.
  • 17. 13 The total "bending angle" at any particular point on the lens remains constant. As you can see, light beams from a closer point converge farther away from the lens than light beams from a point that's farther away. In other words, the real image of a closer object forms farther away from the lens than the real image from a more distant object. You can observe this phenomenon with a simple experiment. Light a candle in the dark, and hold a magnifying glass between it and the wall. You will see an upside down image of the candle on the wall. If the real image of the candle does not fall directly on the wall, it will appear somewhat blurry. The light beams from a particular point don't quite converge at this point. To focus the image, move the magnifying glass closer or farther away from the candle. This is what you're doing when you turn the lens of a camera to focus it -- you're moving it closer or farther away from the film surface. As you move the lens, you can line up the focused real image of an object so it falls directly on the film surface. You now know that at any one point, a lens bends light beams to a certain total degree, no matter the light beam's angle of entry. This total "bending angle" is determined by the structure of the lens.
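The observation that a closer object focuses farther from the lens follows from the thin-lens equation, 1/f = 1/do + 1/di. A minimal sketch (the 50 mm focal length and the candle distances are illustrative, not from the text):

```python
def image_distance(focal_mm, object_mm):
    """Thin-lens equation 1/f = 1/do + 1/di, solved for the image distance di."""
    return 1.0 / (1.0 / focal_mm - 1.0 / object_mm)

# A candle 0.5 m away focuses farther behind a 50 mm lens than one 1 m away,
# which is why you must refocus when the subject moves:
print(image_distance(50, 500))   # ~55.6 mm behind the lens
print(image_distance(50, 1000))  # ~52.6 mm behind the lens
```

Moving the lens relative to the film, as described above, is exactly how the camera places this `di` plane onto the film surface.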
  • 18. 14 Camera Lenses: A standard 50 mm lens doesn't significantly shrink or magnify the image. In the last section, we saw that at any one point, a lens bends light beams to a certain total degree, no matter the light beam's angle of entry. This total "bending angle" is determined by the structure of the lens. A lens with a rounder shape (a center that extends out farther) will have a more acute bending angle. Basically, curving the lens out increases the distance between different points on the lens. This increases the amount of time that one part of the light wave is moving faster than another part, so the light makes a sharper turn. Increasing the bending angle has an obvious effect. Light beams from a particular point will converge at a point closer to the lens. In a lens with a flatter shape, light beams will not turn as sharply. Consequently, the light beams will converge farther away from the lens. To put it another way, the focused real image forms farther away from the lens when the lens has a flatter surface. Increasing the distance between the lens and the real image actually increases the total size of the real image. If you think about it, this makes perfect sense. Think of a projector: As you move the projector farther away from the screen, the image becomes larger. To put it simply, the light beams keep spreading apart as they travel toward the screen. The same basic thing happens in a camera. As the distance between the lens and the real image increases, the light beams spread out more, forming a larger real image. But the size of the film stays constant. When you attach a very flat lens, it projects a large real image but the film is only exposed to the middle part of it. Basically, the lens zeroes in on the middle of the frame, magnifying a small section of the scene in front of you. A rounder lens produces a smaller real image, so the film surface sees a much wider area of the scene (at reduced magnification).
  • 18. 15 Professional cameras let you attach different lenses so you can see the scene at various magnifications. The magnification power of a lens is described by its focal length. In cameras, the focal length is defined as the distance between the lens and the real image of an object in the far distance (the moon for example). A higher focal length number indicates a greater image magnification. Different lenses are suited to different situations. If you're taking a picture of a mountain range, you might want to use a telephoto lens, a lens with an especially long focal length. This lens lets you zero in on specific elements in the distance, so you can create tighter compositions. If you're taking a close-up portrait, you might use a wide-angle lens. This lens has a much shorter focal length, so it shrinks the scene in front of you. The entire face is exposed to the film even if the subject is only a foot away from the camera. A standard 50 mm camera lens doesn't significantly magnify or shrink the image, making it ideal for shooting objects that aren't especially close or far away. Lenses in the Lens A camera lens is actually several lenses combined into one unit. A single converging lens could form a real image on the film, but it would be warped by a number of aberrations. One of the most significant warping factors is that different colors of light bend differently when moving through a lens. This chromatic aberration essentially produces an image where the colors are not lined up correctly. Cameras compensate for this using several lenses made of different materials. The lenses each handle colors differently, and when you combine them in a certain way, the colors are realigned. In a zoom lens, you can move different lens elements back and forth. By changing the distance between particular lenses, you can adjust the magnification power -- the focal length -- of the lens as a whole. 
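The link between focal length and "zoom level" can be made concrete: for a simple rectilinear lens, the angle of view is 2·arctan(sensor width / 2f). A sketch, assuming the 36 mm width of a full-frame (35mm-format) sensor:

```python
import math

def angle_of_view_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal angle of view of a rectilinear lens on a given sensor width."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

print(round(angle_of_view_deg(50), 1))   # ~39.6 deg: the "standard" lens view
print(round(angle_of_view_deg(200), 1))  # ~10.3 deg: telephoto, tight framing
print(round(angle_of_view_deg(24), 1))   # ~73.7 deg: wide angle
```

A smaller sensor passed as `sensor_width_mm` narrows the angle for the same lens, which is the crop factor mentioned earlier.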
Cameras: Recording Light The chemical component in a traditional camera is film. Essentially, when you expose film to a real image, it makes a chemical record of the pattern of light. It does this with a collection of tiny light-sensitive grains, spread out in a chemical suspension on a strip of plastic. When exposed to light, the grains undergo a chemical reaction. Once the roll is finished, the film is developed -- it is exposed to other chemicals, which react with the light-sensitive grains. In black and white film, the developer chemicals darken the grains that were exposed to light. This produces a negative, where lighter areas appear darker and darker areas appear lighter, which is then converted into a positive image in printing. Color film has three different layers of light-sensitive materials, which respond, in turn, to red, green and blue. When the film is developed, these layers are exposed to chemicals that dye the layers of film. When you overlay the color information from all three layers, you get a full-color negative.
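The negative-to-positive printing step described above has a direct digital analogue: inverting each 8-bit channel value. A toy sketch of just the arithmetic (not the actual chemistry of a darkroom):

```python
def negative_to_positive(rgb):
    """Invert an 8-bit RGB triple: light areas become dark and vice versa."""
    return tuple(255 - channel for channel in rgb)

print(negative_to_positive((10, 200, 128)))  # (245, 55, 127)
```

Inverting twice returns the original values, just as printing a negative of a negative would restore the positive image.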
  • 20. 16 For an in-depth description of this entire process, check out How Photographic Film Works. So far, we've looked at the basic idea of photography -- you create a real image with a converging lens, and you record the light pattern of this real image on a layer of light-sensitive material. Conceptually, this is all that's involved in taking a picture. But to capture a clear image, you have to carefully control how everything comes together. Obviously, if you were to lay a piece of film on the ground and focus a real image onto it with a converging lens, you wouldn't get any kind of usable picture. Out in the open, every grain in the film would be completely exposed to light. And without any contrasting unexposed areas, there's no picture. To capture an image, you have to keep the film in complete darkness until it's time to take the picture. Then, when you want to record an image, you let some light in. At its most basic level, this is all the body of a camera is -- a sealed box with a shutter that opens and closes between the lens and film. In fact, the term camera is shortened from camera obscura, literally "dark room" in Latin. For the picture to come out right, you have to precisely control how much light hits the film. If you let too much light in, too many grains will react, and the picture will appear washed out. If you don't let enough light hit the film, too few grains will react, and the picture will be too dark. In the next section, we'll look at the different camera mechanisms that let you adjust the exposure. What's in a Name? As it turns out, the term photography describes the photographic process quite accurately. Sir John Herschel, a 19th century astronomer and one of the first photographers, came up with the term in 1839. The term is a combination of two Greek words -- photos meaning light and graphein meaning writing (or drawing). The term camera comes from camera obscura, Latin for "dark room." 
The camera obscura was actually invented hundreds of years before photography. A traditional camera obscura was a dark room with light shining through a lens or tiny hole in the wall. Light passed through the hole, forming an upside-down real image on the opposite wall. This effect was very popular with artists, scientists and curious spectators.
  • 21. 17 Cameras: The Right Light The plates in the iris diaphragm fold in on each other to shrink the aperture and expand out to make it wider. In the last section, we saw that you need to carefully control the film's exposure to light, or your picture will come out too dark or too bright. So how do you adjust this exposure level? You have to consider two major factors:  How much light is passing through the lens  How long the film is exposed To increase or decrease the amount of light passing through the lens, you have to change the size of the aperture -- the lens opening. This is the job of the iris diaphragm, a series of overlapping metal plates that can fold in on each other or expand out. Essentially, this mechanism works the same way as the iris in your eye -- it opens or closes in a circle, to shrink or expand the diameter of the lens opening. When the opening is smaller, it captures less light, and when it is larger, it captures more light. The length of exposure is determined by the shutter speed. Most SLR cameras use a focal plane shutter. This mechanism is very simple -- it basically consists of two "curtains" between the lens and the film. Before you take a picture, the first curtain is closed, so the film won't be exposed to light. When you take the picture, this curtain slides open. After a certain amount of time, the second curtain slides in from the other side, to stop the exposure. When you click the camera's shutter release, the first curtain slides open, exposing the film. After a certain amount of time, the second shutter slides closed, ending the exposure. The time delay is controlled by the camera's shutter speed knob. This simple action is controlled by a complex mass of gears, switches and springs, like you might find inside a watch. When you hit the shutter button, it releases a lever, which sets several gears in motion. You can tighten or loosen some of the springs by turning the shutter speed knob. 
This adjusts the gear mechanism, increasing or decreasing the delay between the first curtain opening and the second curtain closing. When you set the knob to a very slow shutter speed, the shutter is open for a very long time. When you set the knob to a very high speed, the second
  • 22. 18 curtain follows directly behind the first curtain, so only a tiny slit of the film frame is exposed at any one time. The ideal exposure depends on the size of the light-sensitive grains in the film. A larger grain is more likely to absorb light photons than a smaller grain. The size of the grains is indicated by a film's speed, which is printed on the canister. Different film speeds are suited to different types of photography -- ISO 100 film, for example, is optimal for shots in bright sunlight, while ISO 1600 film should only be used in relatively low light. Inside a manual SLR camera, you'll find an intricate puzzle of gears and springs. As you can see, there's a lot involved in getting the exposure right -- you have to balance film speed, aperture size and shutter speed to fit the light level in your shot. Manual SLR cameras have a built-in light meter to help you do this. The main component of the light meter is a panel of semiconductor light sensors that are sensitive to light energy. These sensors express this light energy as electrical energy, which the light meter system interprets based on the film and shutter speed. Now, let's see how an SLR camera body directs the real image to the viewfinder before you take the shot, and then directs it to the film when you press the shutter button. SLR Cameras vs. Point-and-Shoot There are two types of consumer film cameras on the market -- SLR cameras and "point-and-shoot" cameras. The main difference is how the photographer sees the scene. In a point-and-shoot camera, the viewfinder is a simple window through the body of the camera. You don't see the real image formed by the camera lens, but you get a rough idea of what is in view.
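The balancing act between aperture and shutter speed can be expressed with the standard exposure value, EV = log2(N²/t): settings with the same EV admit the same amount of light (at a fixed film speed). A sketch with illustrative values:

```python
import math

def exposure_value(f_number, shutter_seconds):
    """EV = log2(N^2 / t); equal EV means equal exposure at a fixed ISO."""
    return math.log2(f_number ** 2 / shutter_seconds)

# Opening the aperture one stop (f/8 -> f/5.6) while halving the exposure
# time (1/125 s -> 1/250 s) leaves the exposure (almost) unchanged:
print(round(exposure_value(8, 1 / 125), 2))    # 12.97
print(round(exposure_value(5.6, 1 / 250), 2))  # 12.94
```

The tiny residual difference comes from f/5.6 being a rounded marking for the exact one-stop value f/√32 ≈ 5.657.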
  • 23. 19 In an SLR camera, you see the actual real image that the film will see. If you take the lens off of an SLR camera and look inside, you'll see how this works. The camera has a slanted mirror positioned between the shutter and the lens, with a piece of translucent glass and a prism positioned above it. This configuration works like a periscope -- the real image bounces off the lower mirror on to the translucent glass, which serves as a projection screen. The prism's job is to flip the image on the screen, so it appears right side up again, and redirect it on to the viewfinder window. When you click the shutter button, the camera quickly switches the mirror out of the way, so the image is directed at the exposed film. The mirror is connected to the shutter timer system, so it stays open as long as the shutter is open. This is why the viewfinder is suddenly blacked out when you take a picture. The mirror in an SLR camera directs the real image to the viewfinder. When you hit the shutter button, the mirror flips up so the real image is projected onto the film. In this sort of camera, the mirror and the translucent screen are set up so they present the real image exactly as it will appear on the film. The advantage of this design is that you can adjust the focus and compose the scene so you get exactly the picture you want. For this reason, professional photographers typically use SLR cameras. These days, most SLR cameras are built with both manual and automatic controls, and most point-and-shoot cameras are fully automatic. Conceptually, automatic cameras are pretty much the same as fully manual models, but everything is controlled by a central microprocessor instead of the user. The central microprocessor receives information from the autofocus system and the light meter. Then it activates several small motors, which adjust the lens and open and close the aperture. In modern cameras, this is a pretty advanced computer system. 
Automatic point-and-shoot cameras use circuit boards and electric motors instead of gears and springs. In the next section, we'll look at the other end of the spectrum -- a camera design with no
  • 24. 20 complex machinery, no lens and barely any moving parts. Throughout the history of photography, there have been hundreds of different camera systems. But amazingly, all these designs -- from the simplest homemade box camera to the newest digital camera -- combine the same basic elements: a lens system to create the real image, a light-sensitive sensor to record the real image, and a mechanical system to control how the real image is exposed to the sensor. And when you get down to it, that's all there is to photography! Types of Digital Image Sensors :
  • 25. 21 Working of Camera: In the past twenty years, most of the major technological breakthroughs in consumer electronics have really been part of one larger breakthrough. When you get down to it, CDs, DVDs, HDTV, MP3s and DVRs are all built around the same basic process: converting conventional analog information (represented by a fluctuating wave) into digital information (represented by ones and zeros, or bits). This fundamental shift in technology totally changed how we handle visual and audio information -- it completely redefined what is possible. The digital camera is one of the most remarkable instances of this shift because it is so truly different from its predecessor. Conventional cameras depend entirely on chemical and mechanical processes -- you don't even need electricity to operate them. On the other hand, all digital cameras have a built-in computer, and all of them record images electronically. The new approach has been enormously successful. Since film still provides better picture quality, digital cameras have not completely replaced conventional cameras. But, as digital imaging technology has improved, digital cameras have rapidly become more popular. In this article, we'll find out exactly what's going on inside these amazing digital-age devices. Digital Camera Basics Let's say you want to take a picture and e-mail it to a friend. To do this, you need the image to be represented in the language that computers recognize -- bits and bytes. Essentially, a digital image is just a long string of 1s and 0s that represent all the tiny colored dots -- or pixels -- that collectively make up the image. (For information on sampling and digital representations of data, see this explanation of the digitization of sound waves. Digitizing light waves works in a similar way.) 
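The idea that a digital image is "just a long string of 1s and 0s" can be made concrete. A tiny hypothetical example, assuming an 8-bit grayscale image (one byte per pixel):

```python
# A 2x3 grayscale image: each pixel is one byte (0 = black, 255 = white).
pixels = [
    [0,   128, 255],
    [64,  192, 32],
]

# Flatten to the byte string a camera or file format would actually store.
raw = bytes(value for row in pixels for value in row)
print(len(raw))     # 6 bytes for 6 pixels
print(bin(raw[1]))  # 0b10000000 -- the "ones and zeros" of one pixel
```

A color image works the same way, just with three values (red, green, blue) per pixel instead of one.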
If you want to get a picture into this form, you have two options:  You can take a photograph using a conventional film camera, process the film chemically, print it onto photographic paper and then use a digital scanner to sample the print (record the pattern of light as a series of pixel values).  You can directly sample the original light that bounces off your subject, immediately breaking that light pattern down into a series of pixel values -- in other words, you can use a digital camera. At its most basic level, this is all there is to a digital camera. Just like a conventional camera, it has a series of lenses that focus light to create an image of a scene. But instead of focusing this light onto a piece of film, it focuses it onto a semiconductor device that records light electronically. A computer then breaks this electronic information down into digital data. All the fun and interesting features of digital cameras come as a direct result of this process. In the next few sections, we'll find out exactly how the camera does all this.
  • 26. 22 Cool Facts  With a 3-megapixel camera, you can take a higher-resolution picture than most computer monitors can display.  You can use your Web browser to view digital pictures taken using the JPEG format.  The first consumer-oriented digital cameras were sold by Kodak and Apple in 1994.  In 1998, Sony inadvertently sold more than 700,000 camcorders with a limited ability to see through clothes. CCD and CMOS: Filmless Cameras A CMOS image sensor Instead of film, a digital camera has a sensor that converts light into electrical charges. The image sensor employed by most digital cameras is a charge coupled device (CCD). Some cameras use complementary metal oxide semiconductor (CMOS) technology instead. Both CCD and CMOS image sensors convert light into electrons. If you've read How Solar Cells Work, you already understand one of the pieces of technology used to perform the conversion. A simplified way to think about these sensors is to think of a 2-D array of thousands or millions of tiny solar cells. Once the sensor converts the light into electrons, it reads the value (accumulated charge) of each cell in the image. This is where the differences between the two main sensor types kick in:  A CCD transports the charge across the chip and reads it at one corner of the array. An analog-to-digital converter (ADC) then turns each pixel's value into a digital value by measuring the amount of charge at each photosite and converting that measurement to binary form.  CMOS devices use several transistors at each pixel to amplify and move the charge using more traditional wires. Differences between the two types of sensors lead to a number of pros and cons:
  • 27. 23 A CCD sensor PHOTO COURTESY DALSA  CCD sensors create high-quality, low-noise images. CMOS sensors are generally more susceptible to noise.  Because each pixel on a CMOS sensor has several transistors located next to it, the light sensitivity of a CMOS chip is lower. Many of the photons hit the transistors instead of the photodiode.  CMOS sensors traditionally consume little power. CCDs, on the other hand, use a process that consumes lots of power. CCDs consume as much as 100 times more power than an equivalent CMOS sensor.  CCD sensors have been mass produced for a longer period of time, so they are more mature. They tend to have higher quality pixels, and more of them. Although numerous differences exist between the two sensors, they both play the same role in the camera -- they turn light into electricity. For the purpose of understanding how a digital camera works, you can think of them as nearly identical devices. Digital Camera Resolution The size of an image taken at different resolutions PHOTO COURTESY MORGUEFILE
  • 28. 24 The amount of detail that the camera can capture is called the resolution, and it is measured in pixels. The more pixels a camera has, the more detail it can capture and the larger pictures can be without becoming blurry or "grainy." Some typical resolutions include:  256x256 - Found on very cheap cameras, this resolution is so low that the picture quality is almost always unacceptable. This is 65,000 total pixels.  640x480 - This is the low end on most "real" cameras. This resolution is ideal for e-mailing pictures or posting pictures on a Web site.  1216x912 - This is a "megapixel" image size -- 1,109,000 total pixels -- good for printing pictures.  1600x1200 - With almost 2 million total pixels, this is "high resolution." You can print a 4x5 inch print taken at this resolution with the same quality that you would get from a photo lab.  2240x1680 - Found on 4 megapixel cameras -- the current standard -- this allows even larger printed photos, with good quality for prints up to 16x20 inches.  4064x2704 - A top-of-the-line digital camera with 11.1 megapixels takes pictures at this resolution. At this setting, you can create 13.5x9 inch prints with no loss of picture quality. High-end consumer cameras can capture over 12 million pixels. Some professional cameras support over 16 million pixels, or 20 million pixels for large- format cameras. For comparison, Hewlett Packard estimates that the quality of 35mm film is about 20 million pixels [ref]. Next, we'll look at how the camera adds color to these images. How Many Pixels? You may have noticed that the number of pixels and the maximum resolution don't quite compute. For example, a 2.1-megapixel camera can produce images with a resolution of 1600x1200, or 1,920,000 pixels. But "2.1 megapixel" means there should be at least 2,100,000 pixels. This isn't an error from rounding off or binary mathematical trickery. 
There is a real discrepancy between these numbers because the CCD has to include circuitry for the ADC to measure the charge. This circuitry is dyed black so that it doesn't absorb light and distort the image, and the photosites it covers contribute nothing to the picture -- which is why the effective pixel count is lower than the advertised total.
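The ADC step can be sketched as a simple quantization: each photosite's accumulated charge (an analog value) is mapped to one of 2^n digital levels. The full-scale charge and 12-bit depth below are illustrative, not tied to any particular sensor:

```python
def adc(charge, full_scale=1.0, bits=12):
    """Quantize an analog charge in [0, full_scale] to an n-bit code."""
    levels = 2 ** bits
    code = int(charge / full_scale * (levels - 1))
    return max(0, min(levels - 1, code))  # clamp saturated pixels

print(adc(0.0))  # 0    (dark pixel)
print(adc(0.5))  # 2047 (mid-gray)
print(adc(1.5))  # 4095 (overexposed -> clipped to full scale)
```

Real converters add calibration and noise handling, but the core idea is exactly this mapping from charge to a binary code, repeated for every photosite.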
  • 29. 25 Capturing Color How the original (left) image is split in a beam splitter Unfortunately, each photosite is colorblind. It only keeps track of the total intensity of the light that strikes its surface. In order to get a full color image, most sensors use filtering to look at the light in its three primary colors. Once the camera records all three colors, it combines them to create the full spectrum. There are several ways of recording the three colors in a digital camera. The highest quality cameras use three separate sensors, each with a different filter. A beam splitter directs light to the different sensors. Think of the light entering the camera as water flowing through a pipe. Using a beam splitter would be like dividing an identical amount of water into three different pipes. Each sensor gets an identical look at the image; but because of the filters, each sensor only responds to one of the primary colors. The advantage of this method is that the camera records each of the three colors at each pixel location. Unfortunately, cameras that use this method tend to be bulky and expensive. Another method is to rotate a series of red, blue and green filters in front of a single sensor. The sensor records three separate images in rapid succession. This method also provides information on all three colors at each pixel location; but since the three images aren't taken at precisely the same moment, both the camera and the target of the photo must remain stationary for all three readings. This isn't practical for candid photography or handheld cameras. Both of these methods work well for professional studio cameras, but they're not necessarily practical for casual snapshots. Next, we'll look at filtering methods that are more suited to small, efficient cameras.
  • 30. 26 Demosaicing Algorithms: Color Filtering A more economical and practical way to record the primary colors is to permanently place a filter called a color filter array over each individual photosite. By breaking up the sensor into a variety of red, blue and green pixels, it is possible to get enough information in the general vicinity of each sensor to make very accurate guesses about the true color at that location. This process of looking at the other pixels in the neighborhood of a sensor and making an educated guess is called interpolation. The most common pattern of filters is the Bayer filter pattern. This pattern alternates a row of red and green filters with a row of blue and green filters. The pixels are not evenly divided -- there are as many green pixels as there are blue and red combined. This is because the human eye is not equally sensitive to all three colors. It's necessary to include more information from the green pixels in order to create an image that the eye will perceive as a "true color." The advantages of this method are that only one sensor is required, and all the color information (red, green and blue) is recorded at the same moment. That means the camera can be smaller, cheaper, and useful in a wider variety of situations. The raw output from a sensor with a Bayer filter is a mosaic of red, green and blue pixels of different intensity. Digital cameras use specialized demosaicing algorithms to convert this mosaic into an equally sized mosaic of true colors. The key is that each colored pixel can be used more than once. The true color of a single pixel can be determined by averaging the values from the closest surrounding pixels. Some single-sensor cameras use alternatives to the Bayer filter pattern. X3 technology, for example, embeds red, green and blue photodetectors in silicon. Some of the more advanced cameras subtract values using the typesetting colors cyan, yellow, green and magenta instead of blending red, green and blue. 
There is even a method that uses two sensors. However, most consumer cameras on the market today use a single sensor with alternating rows of green/red and green/blue filters. Digital Camera Exposure and Focus Just as with film, a digital camera has to control the amount of light that reaches the sensor. The two components it uses to do this, the aperture and shutter speed, are also present on conventional cameras.  Aperture: The size of the opening in the camera. The aperture is automatic in most digital cameras, but some allow manual adjustment to give professionals and hobbyists more control over the final image.
  • 31. 27  Shutter speed: The amount of time that light can pass through the aperture. Unlike film, the light sensor in a digital camera can be reset electronically, so digital cameras have a digital shutter rather than a mechanical shutter. These two aspects work together to capture the amount of light needed to make a good image. In photographic terms, they set the exposure of the sensor. You can learn more about a camera's aperture and shutter speed in How Cameras Work. In addition to controlling the amount of light, the camera has to adjust the lenses to control how the light is focused on the sensor. In general, the lenses on digital cameras are very similar to conventional camera lenses -- some digital cameras can even use conventional lenses. Most use automatic focusing techniques, which you can learn more about in the article How Autofocus Cameras Work. The focal length, however, is one important difference between the lens of a digital camera and the lens of a 35mm camera. The focal length is the distance between the lens and the surface of the sensor. Sensors from different manufacturers vary widely in size, but in general they're smaller than a piece of 35mm film. In order to project the image onto a smaller sensor, the focal length is shortened by the same proportion. For additional information on sensor sizes and comparisons to 35mm film, you can visit the Photo.net Web site. Focal length also determines the magnification, or zoom, when you look through the camera. In 35mm cameras, a 50mm lens gives a natural view of the subject. Increasing the focal length increases the magnification, and objects appear to get closer. The reverse happens when decreasing the focal length. A zoom lens is any lens that has an adjustable focal length, and digital cameras can have optical or digital zoom -- some have both. Some cameras also have macro focusing capability, meaning that the camera can take pictures from very close to the subject. 
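The proportional shortening of focal length for a smaller sensor is what photographers call the crop factor: the ratio of the 35mm film diagonal to the sensor diagonal. A small sketch, with illustrative sensor dimensions (the 23.6 x 15.7 mm figures are a typical APS-C size, not a specific camera):

```python
import math

def crop_factor(sensor_w, sensor_h, film_w=36.0, film_h=24.0):
    """Ratio of the 35mm film diagonal to the sensor diagonal (mm)."""
    return math.hypot(film_w, film_h) / math.hypot(sensor_w, sensor_h)

factor = crop_factor(23.6, 15.7)
print(round(factor, 2))    # ~1.53

# A 50mm lens on this sensor frames like a ~76mm lens on 35mm film,
# so a shorter (~33mm) lens is needed to get the 50mm "natural" view.
print(round(50 * factor))  # 76
print(round(50 / factor))  # 33
```

This is the "shortened by the same proportion" relationship from the text, expressed as arithmetic.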
Digital cameras have one of four types of lenses:  Fixed-focus, fixed-zoom lenses - These are the kinds of lenses on disposable and inexpensive film cameras -- inexpensive and great for snapshots, but fairly limited.  Optical-zoom lenses with automatic focus - Similar to the lens on a video camcorder, these have "wide" and "telephoto" options and automatic focus. The camera may or may not support manual focus. These actually change the focal length of the lens rather than just magnifying the information that hits the sensor.  Digital zoom - With digital zoom, the camera takes pixels from the center of the image sensor and interpolates them to make a full-sized image. Depending on the resolution of the image and the sensor, this approach may create a grainy or fuzzy image. You can manually do the same thing with image processing software -- simply snap a picture, cut out the center and magnify it.  Replaceable lens systems - These are similar to the replaceable lenses on a 35mm camera. Some digital cameras can use 35mm camera lenses.
  • 32. 28 Next, we'll learn about how the camera stores pictures and transfers them to a computer. Storing Digital Photos A CompactFlash card Most digital cameras have an LCD screen, so you can view your picture right away. This is one of the great advantages of a digital camera -- you get immediate feedback on what you capture. Of course, viewing the image on your camera would lose its charm if that's all you could do. You want to be able to load the picture into your computer or send it directly to a printer. There are several ways to do this. Early generations of digital cameras had fixed storage inside the camera. You needed to connect the camera directly to a computer with cables to transfer the images. Although most of today's cameras are capable of connecting through serial, parallel, SCSI, USB or FireWire connections, they usually also use some sort of removable storage device. Digital cameras use a number of storage systems. These are like reusable, digital film, and they use a caddy or card reader to transfer the data to a computer. Many involve fixed or removable flash memory. Digital camera manufacturers often develop their own proprietary flash memory devices, including SmartMedia cards, CompactFlash cards and Memory Sticks. Some other removable storage devices include:  Floppy disks  Hard disks, or microdrives  Writeable CDs and DVDs No matter what type of storage they use, all digital cameras need lots of room for pictures. They usually store images in one of two formats -- TIFF, which is uncompressed, and JPEG, which is compressed, but some use RAW format. Most cameras use the JPEG file format for storing pictures, and they sometimes offer quality settings (such as medium or high). The following information will give you an idea of the file sizes you might expect with different picture sizes. 640x480  TIFF (uncompressed) 1.0 MB  JPEG (high quality) 300 KB  JPEG (medium quality) 90 KB 800x600  TIFF (uncompressed) 1.5 MB
  • 33. 29  JPEG (high quality) 500 KB  JPEG (medium quality) 130 KB 1024x768  TIFF (uncompressed) 2.5 MB  JPEG (high quality) 800 KB  JPEG (medium quality) 200 KB 1600x1200  TIFF (uncompressed) 6.0 MB  JPEG (high quality) 1.7 MB  JPEG (medium quality) 420 KB To make the most of their storage space, almost all digital cameras use some sort of data compression to make the files smaller. Two features of digital images make compression possible. One is repetition. The other is irrelevancy. Imagine that throughout a given photo, certain patterns develop in the colors. For example, if a blue sky takes up 30 percent of the photograph, you can be certain that some shades of blue are going to be repeated over and over again. When compression routines take advantage of patterns that repeat, there is no loss of information and the image can be reconstructed exactly as it was recorded. Unfortunately, this doesn't reduce files any more than 50 percent, and sometimes it doesn't even come close to that level. Irrelevancy is a trickier issue. A digital camera records more information than the human eye can easily detect. Some compression routines take advantage of this fact to throw away some of the more meaningless data. Next, we'll tie it all together and see how a digital camera takes a picture. CCD Camera Summary It takes several steps for a digital camera to take a picture. Here's a review of what happens in a CCD camera, from beginning to end:  You aim the camera at the subject and adjust the optical zoom to get closer or farther away.  You press lightly on the shutter release.  The camera automatically focuses on the subject and takes a reading of the available light.  The camera sets the aperture and shutter speed for optimal exposure.  You press the shutter release all the way.  The camera resets the CCD and exposes it to the light, building up an electrical charge, until the shutter closes.
  • 34. 30  The ADC measures the charge and creates a digital signal that represents the values of the charge at each pixel.  A processor interpolates the data from the different pixels to create natural color. On many cameras, it is possible to see the output on the LCD at this stage.  A processor may perform a preset level of compression on the data.  The information is stored in some form of memory device (probably a Flash memory card). A CCD Image Sensor: The advent of CMOS technology in the eighties led to phenomenal growth in the semiconductor industry. Transistors have become smaller, faster, cheaper to manufacture, and they consume less power. It is CMOS technology that has enabled very high integration on chips, leading to modern high-performance, miniaturized integrated circuits. Apart from its valuable contribution to the miniaturization of integrated circuits, CMOS technology has also found use in sensing, and it has been adopted to design sensors, especially in the field of imaging. Due to the wide usage of CMOS-based image sensors, CMOS sensors are
  • 35. 31 now often treated as synonymous with image sensors generally, and they have emerged as a competitor to CCD-based image sensors. Until recently, Charge Coupled Devices (CCDs) dominated most image sensing systems, i.e., cameras, camcorders, etc. CCDs have been used in astronomical cameras, video camcorders and scanners. Of late, however, CMOS imagers have emerged as an alternative to CCD imagers, and they offer additional advantages. Subsequent sections discuss both CCD and CMOS sensor based imagers, their pros and cons, and their applications. Further applications of CMOS technology in the field of sensing will also be discussed.
  • 36. 32 CMOS Vs CCD The invention of the CCD marked the end of the vacuum tube imagers used in television cameras, as it overcame their disadvantages: chronic picture artifacts such as lag and burn-in, the fragility of large glass tubes, sensitivity to shock, vibration and electromagnetic radiation, painstaking periodic alignment of tubes, etc. It also marked the beginning of a new era in imaging systems, and for decades the CCD enjoyed quality advantages over the rival CMOS sensors. Wherever image quality was paramount, CCDs were preferred; CMOS sensors were used mainly in applications where small size and low power were the prime requirements. With developments in CMOS technology, the gap between CCD and CMOS sensors has narrowed, and CMOS sensors can now achieve competitive quality. Choosing between CCD and CMOS sensors has become increasingly difficult. Both CCD and CMOS image sensors use large arrays of thousands (sometimes millions) of photo-sites, commonly called pixels. Both carry out the same steps. 1. Light-to-charge conversion Incident light is directed by the microlens (a tiny lens placed over the pixel to increase its effective size and thereby its fill factor) onto the photo-sensitive area of each pixel, where it is converted into electrons that collect in a semiconductor "bucket." The bigger the pixel, the more light it can collect. Thus, big-pixel sensors work best under low-light conditions. For the same number of pixels, bigger pixels result in a bigger chip, which means higher cost. Conversely, smaller pixels enable smaller chip sizes and lower chip prices, as well as lower lens costs. But there are limitations on
  • 37. 33 pixel size reduction. Smaller pixels are less sensitive to light, and the optics required to resolve them become expensive and demand costly fabrication processes. 2. Charge accumulation As more light enters, more electrons accumulate in the bucket. 3. Transfer Accumulated charge must be transferred to the signal conditioning and processing circuitry. 4. Charge-to-voltage conversion The accumulated charge must be output as a voltage signal. 5. Amplification The voltage signal is then amplified before it is fed to the camera circuitry. Both CMOS and CCD sensors perform all these tasks; where they differ is the order in which the tasks are executed. BRIEF ON CCD TECHNOLOGY CCDs were first invented in 1969 as a way to store data using bubble memory. In 1974, the first imaging CCD was produced by Fairchild Electronics with a format of 100x100 pixels. A CCD imager consists of two main parts: the color filter and the pixel array. • Color filter Micro-lenses funnel light onto the photo-sensitive part of each pixel. On their way, the photons pass through a color filter array. The mosaic of these tiny filters captures color information. Color filters enable separate measurement of the red (R), green (G) and blue (B) photons. The color filter blocks the wavelengths of unwanted colors and allows only a specific color of light to pass through to a pixel sensor. For this purpose, each pixel is covered with a red, a green or a blue filter according to a specific pattern, like the Bayer CFA pattern. The Bayer filter uses a 2x2 sub-mosaic pattern with one red, one blue and two green filters. As the human eye is more sensitive to green light, two green filters are used.
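Interpolation over a Bayer mosaic can be sketched by averaging each pixel's nearest same-color neighbors. This toy demosaic assumes the RGGB layout described above (one red, one blue, two greens per 2x2 block) and ignores the border-handling refinements of real cameras:

```python
def bayer_color(y, x):
    """RGGB pattern: even rows alternate R,G; odd rows alternate G,B."""
    if y % 2 == 0:
        return "R" if x % 2 == 0 else "G"
    return "G" if x % 2 == 0 else "B"

def demosaic(mosaic):
    """Estimate full RGB at each pixel by averaging the same-color
    photosites in its 3x3 neighborhood (crude bilinear interpolation)."""
    h, w = len(mosaic), len(mosaic[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            rgb = {}
            for ch in "RGB":
                samples = [
                    mosaic[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))
                    if bayer_color(yy, xx) == ch
                ]
                rgb[ch] = sum(samples) // len(samples)
            out[y][x] = (rgb["R"], rgb["G"], rgb["B"])
    return out

# A 2x2 mosaic lit by uniform mid-gray light: every photosite reads 100,
# and the reconstruction is uniform gray at every pixel.
flat = demosaic([[100, 100], [100, 100]])
print(flat[0][0])  # (100, 100, 100)
```

Production demosaicing algorithms are considerably more sophisticated (edge-aware interpolation, false-color suppression), but this is the basic "educated guess from the neighborhood" idea.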
  • 38. 34 • Pixel Array The pixel array functions on the principle of the photoelectric effect, and the pixel sensors are responsible for capturing the intensity of the light passing through. The light intensity data is combined before being converted into an analog voltage signal, which is output to an external circuit board for further processing. After incident light is converted into electrons, the electron charge is accumulated in the same way a bucket stores water. The pixel charges are read out using vertical and horizontal shift registers, which act as charge carriers. CMOS SENSORS A typical CMOS sensor is an integrated circuit with an array of pixel sensors. In contrast to a CCD, each pixel in a CMOS sensor contains its own light sensor, an amplifier and a pixel select switch. An analog-to-digital converter and other components critical to the operation of the pixel sensors are located on the CMOS sensor itself. The CMOS sensor contains four main parts: the color filters, the pixel array, the digital controller, and the analog-to-digital converter. • Color Filter The color filter is the same as described for the CCD-based imager. • Pixel Array As in the case of the CCD, the function of the pixel array is to capture the intensity of the light passing through. Each pixel sensor converts the intensity of the incoming light to a voltage signal, which is then fed to the ADC for further processing. There are two pixel sensor architectures: Passive Pixel Sensors (PPS) and Active Pixel Sensors (APS). In passive pixel sensors, only one photo-detector (without any local amplifier) per pixel is used, whereas in active pixel sensors, 3-4 transistors per pixel are used. Passive pixel sensors have smaller pixels and a large fill
  • 39. 35 factor, but they are slow and have low SNR. On the other hand, active pixel sensors are fast and have good SNR, but larger pixels and a low fill factor. However, as CMOS technology has advanced into the nanometer regime, pixel size and fill factor are no longer a big issue, and APS is the architecture preferred and used in most devices. • ADC The ADC takes the analog voltage signals from the pixel sensor array and converts them into a digital signal. • Digital Controller The digital controller governs the functioning of the CMOS sensor; it controls the pixel array, ensures synchronism between all pixels, etc. Operation of CMOS Sensors a) The pixel sensor acts like a charge bucket; it accumulates electron charge the same way a water bucket stores water. b) Charge is converted to voltage and amplified at the pixel. c) Individual CMOS microwires carry the voltage from one pixel at a time, controlled by the pixel select switch. d) To output the video signal, the following steps are followed: 1. The pixel select switches of a row are turned ON. This outputs the voltage of each pixel in that row to its column circuit. 2. Column select switches are turned ON from left to right. In this way, the signal voltages of the pixels in the same row are output in order. 3. This is repeated for all rows from top to bottom, so the signal voltages of all pixels are output from the top-left corner to the bottom-right corner of the image sensor. e) These signal voltages are output to the signal processor of the camera.
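The row/column select sequence in steps 1-3 above can be simulated directly; this sketch reads a small array out top-left to bottom-right, exactly as described:

```python
def read_out(pixel_voltages):
    """Simulate CMOS readout: for each row (row select ON), scan the
    column select switches left to right, emitting one voltage at a time."""
    stream = []
    for row in pixel_voltages:   # row select: top to bottom
        for voltage in row:      # column select: left to right
            stream.append(voltage)
    return stream

# A 2x3 sensor with arbitrary illustrative pixel voltages (millivolts):
sensor = [[10, 20, 30],
          [40, 50, 60]]
print(read_out(sensor))  # [10, 20, 30, 40, 50, 60]
```

Note that because rows are selected one after another, a rolling-shutter sensor exposes and reads each row at a slightly different time, which is the origin of the motion distortion discussed in the next section.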
  • 40. 36 CMOS SENSOR TYPES The difference between types of CMOS sensors generally comes down to the number of transistors per pixel, which affects the fill factor. The fraction of the pixel area that is actually sensitive to light is called the fill factor. a) Rolling shutter type This has a limited number of transistors and therefore a high fill factor. However, lines of pixels are exposed at different times, so movement in the target gives a distorted image. b) Global shutter type The number of transistors is higher in this case, resulting in a lower fill factor. But all the pixels are exposed at the same time, so the movement artifacts associated with rolling shutter sensors are removed. CCD AND CMOS SENSORS: PROS AND CONS 1. Fabrication Process CCD sensors use specialized fabrication with dedicated and costly manufacturing processes, whereas CMOS sensors rely on standard CMOS technology (the same processes used to fabricate ICs such as microprocessors and memory). Because CMOS sensors can also integrate the required electronics on the same chip, they result in a compact and cost-effective system. 2. Dynamic Range The dynamic range of a CCD is roughly twice that of a CMOS sensor. This implies that if better color depth is required, CCDs are likely to offer better results. On the other hand, CMOS sensors are marginally more photosensitive.
  • 41. 37 3. Power Consumption CMOS sensors have lower power consumption than CCDs, though the surrounding CMOS circuitry may require more power. Low-end CMOS sensors have low power requirements, but high-speed CMOS cameras typically require more power than CCDs. 4. Noise Two types of noise affect sensor performance: temporal noise and fixed pattern noise. Fixed pattern noise is higher in CMOS sensors than in CCDs, because charge is converted to voltage at each pixel rather than at a single conversion point as in a CCD. In terms of temporal noise, CMOS sensors are better, as the bandwidth of the amplifier at each pixel is lower than that of the single output amplifier in a CCD. 5. Image Quality Due to the poor fill factor of CMOS pixels, the photosensitivity of CMOS sensors is poor in low-light conditions. 6. Uniformity of Response CCDs use a single amplifier for all pixels, while CMOS sensors use a separate amplifier for each pixel. Pixel-to-pixel amplification differences lead to non-uniformity, so the response of CCDs is more uniform. 7. Speed CMOS sensors are faster because the active pixels and ADCs are on the same chip, leading to smaller propagation delays. 8. Readout Area CMOS sensors allow any region, or even multiple regions, to be read off the sensor; CCDs are limited to vertical-scan readout. 9. Smart Functions With signal processing circuitry integrated on the CMOS sensor chip, functions like auto gain control, auto exposure control, anti-jitter, image compression, color encoding and motion tracking can be incorporated on-chip.
  • 42. 38 10. Overexposure Effect Overexposure can cause smearing around over-exposed pixels. Smearing is caused by charge spilling into the shift register. Due to the absence of shift registers in CMOS sensors, they are immune to this effect. A CMOS Image Sensor:
  • 43. 39 Film vs. Digital: A Comparison of the Advantages and Disadvantages In a world in which photographs are primarily taken with digital image sensors, there is a growing number of photographers newly interested in the film formats of the past. But why would anyone in our age of technological convenience still choose to shoot with analog film? To understand the advantages and disadvantages of each shooting practice, we compare the image quality of each format, along with the cost of usage. If you have been thinking of tinkering with film photography, you have landed in the right place. Resolution When it comes to both digital and analog formats, photographers want to know that their efforts will result in sharp, high-resolution photographs. With digital
  • 44. 40
image sensors, we determine resolution by counting the number of pixels within a given area. Film does not have pixels, so the resolving power of a film is instead measured through angular resolution. The two measurements can be correlated with each other and thus compared for equivalent resolution. Just as different sensors produce different resolutions, different types of film also produce different resolutions. Roger N. Clark's analysis of standard 35mm film showed that, depending on the type of film used, the resolution fell between 4 and 16 million pixels. For example, Clark's study noted that Fujifilm's Provia 100 film produced a resolution of around 7 MP while Fujifilm's Velvia 50 produced around 16 MP. Considering that entry-level cameras such as Nikon's D3300 produce around 24 MP, 35mm film doesn't have much of an advantage in this scenario. That said, many professional photographers who shoot film opt for medium or large formats. Research carried out by a team of four industry experts found that medium format film can potentially capture a jaw-dropping 400 MP photograph, although after digital scanning this resulted in a resolution of 50 to 80 MP. Another test, also conducted by Roger N. Clark, noted that larger formats such as 4×5 inches can capture 200 MP equivalent photographs after being scanned. In short, the 35mm film camera you picked up from the flea market may not outperform the latest digital cameras, but a medium format or large format unit can match or exceed the resolution of Phase One's latest $40,000 camera system.
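Megapixel equivalents like Clark's come from film resolving power, measured in line pairs per millimetre (lp/mm). As a rough sketch, each line pair needs about two pixels to be represented; the 50 lp/mm figure below is an illustrative assumption for demonstration, not one of Clark's measured values.

```python
def film_megapixels(lp_per_mm, width_mm, height_mm):
    """Equivalent megapixel count for a film frame: resolving one
    line pair takes ~2 pixels, so the pixel density along each
    axis is 2 * lp_per_mm pixels per millimetre."""
    px_wide = 2 * lp_per_mm * width_mm
    px_high = 2 * lp_per_mm * height_mm
    return px_wide * px_high / 1e6

# Illustrative (assumed) resolving power of 50 lp/mm:
print(film_megapixels(50, 36, 24))  # 35mm frame -> 8.64 MP
print(film_megapixels(50, 56, 42))  # 6x4.5 medium format -> 23.52 MP
```

The 35mm result lands inside the 4-16 MP range quoted above, and the larger frame shows why medium and large formats scale so much higher: the same emulsion simply covers more area.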
  • 45. 41
Digital Noise / Film Grain
The random appearance of small textures within a photograph is referred to as digital noise or film grain. With analog film, grain is the result of small chemical particles that have not received enough light. Within digital image sensors, noise is the result of unwanted signals created by the camera's circuitry; this can be due to excess heat or to stray electrical interference the sensor fails to reject. Increasing the ISO of a digital camera, or selecting high-speed film, makes your photographs more susceptible to noise and grain. In most situations noise is unwanted in color photos; with black-and-white images, however, some artists view grain as adding character rather than as a defect. Testing by magnetic recording technology expert Norman Koren showed that digital photography has evolved to the point where it has far less noise than film of the equivalent speed. Of course, digital noise depends on the sensor within a digital camera, so older units may not perform as well.
  • 46. 42
One last item to consider with noise and grain is that film may be a better medium for capturing long-exposure photographs. Image sensors must be kept at low temperatures to avoid thermal noise, which becomes difficult with prolonged use of the imaging circuitry. Film, on the other hand, has no issues with overheating.

Dynamic Range
Once the almighty reason to shoot analog film over digital, dynamic range is no longer the huge debate it once was. While assessing the dynamic range of an image is a complex process that takes into account the sensor used, the type of file compression, and other factors, digital is ultimately winning against analog film.
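A "stop" is a factor of two in light. For a digital sensor, dynamic range in stops is commonly estimated as the base-2 logarithm of the ratio between the largest recordable signal (full-well capacity) and the noise floor (read noise). The electron counts below are assumed, illustrative figures, not the specifications of any particular camera.

```python
import math

def dynamic_range_stops(full_well_electrons, read_noise_electrons):
    """Dynamic range in stops: each stop is a factor of two between
    the brightest recordable signal and the noise floor."""
    return math.log2(full_well_electrons / read_noise_electrons)

# Assumed figures: 60,000 e- full well, 3 e- read noise.
print(round(dynamic_range_stops(60000, 3), 1))  # 14.3
```

A result around 14 stops is consistent with the figures quoted for modern digital cameras in the next section.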
  • 47. 43
A release by Kodak showed that most film has around 13 stops of dynamic range. Today's modern digital cameras average around 14 stops, with high-end units such as the Nikon D810 reaching almost 15 stops. Film continues to deliver incredible dynamic range, but today's digital technology can easily match it. Independent testing of dynamic range on film cameras, such as the tests conducted by Roger N. Clark, showed that high-end digital cameras in 2005 began to show "huge dynamic range compared to [scans of] either print or slide film". Films used in the testing included Kodak Gold 200 and Fujifilm FujiChrome Velvia. In addition, many digital cameras take advantage of sequential shots and HDR capabilities to create photographs with exceptionally high dynamic range, beyond what is possible with film.

Film Speed
When it comes to shooting in low-light conditions, digital image sensors easily take the cake. Film is usually available in speeds between ISO 100 and 3200, although ISO 6400 film does exist. Today's digital camera systems can match the noise produced by analog cameras in these ranges and push their sensitivity many stops higher. Consumer digital cameras such as Fujifilm's X100T can simulate sensitivities as high as ISO 51200, while professional Nikon systems, such as the D4s, can shoot as high as ISO 409,600. Digital cameras also have the advantage of being able to change film speed between individual photographs. For most common roll films used today (135, 120, etc.), the
  • 48. 44
ISO is kept constant across the entire roll. The exception is with large format cameras, which use one sheet at a time and thus can be switched between shots. Analog film can be pushed or pulled multiple stops when needed, but the amount of contrast within the image is affected. Some photographers use this to their advantage to create the look they desire, but the method still does not allow extremely high ISO speeds without impacting image tones.

Cost and Convenience
When it comes to cost and convenience, both digital and analog formats have their advantages and disadvantages. Noting the number of photographs you take within a given time, the urgency of needing an image available, and the type of subjects you shoot will help you choose between the two options. Digital has a much higher up-front cost, and evolving technology means you will most likely want to upgrade your equipment within a few years. For those who demand instant access to their photographs, there is nothing faster or more convenient than digital. When shooting high-speed action photography, there is also no concern about running out of film; large memory cards can easily store hundreds or thousands of high-resolution photographs. Analog is much more affordable up front, and you will most likely be able to use your film body for decades to come, since genuine enhancements come to the film itself rather than the camera. That said, analog shooters spend far more money on film rolls and development costs. Film must be conserved more carefully, since nothing can simply be deleted as with digital, and photos are not available instantly. Most processing labs take at least 24 hours, if not a few days, to complete the process. Sadly, one-hour photography stores are a dying breed.
  • 49. 45
Let's say that you want a modern digital camera with resolution, dynamic range, and grain equivalent to ISO 100 film. You may choose to pick up a Nikon D3300, an entry-level camera that checks off all the boxes. The initial purchase may cost $500, but with a cheap memory card ($30) you can shoot unlimited photographs and delete what you don't need. You may then opt to upgrade your camera within a five-year span for another $500. If you were instead to pick up a decent film camera for $150 and then shoot 100 photographs a month for a year, your total film costs would be around $260 (using Kodak Ektar 100 Pro) and your development costs around $370. Over a five-year span you may never need to upgrade the camera, but total film and development costs would still amount to around $3,200.
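The dollar figures in this example can be tallied in a short script. All amounts come from the example above; the raw film total comes out slightly higher than the rounded ~$3,200 figure cited.

```python
# Rough five-year cost tally using the figures quoted above.
def five_year_cost(upfront, yearly_running, upgrades=0):
    return upfront + 5 * yearly_running + upgrades

digital = five_year_cost(upfront=500 + 30,       # body + memory card
                         yearly_running=0,       # shots are effectively free
                         upgrades=500)           # one body upgrade in 5 years

film = five_year_cost(upfront=150,               # used film body
                      yearly_running=260 + 370)  # film + development per year

print(digital)  # 1030
print(film)     # 3300
```

At 100 frames a month, film costs roughly three times as much over five years, which is the core of the cost argument for digital.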
  • 50. 46
Conclusion:
The digital revolution has caught up to film in many regards, killing many of the arguments for film being better than its technological counterpart. However, the most notable reason to shoot analog may be the resolution obtainable from medium format cameras. Not all explanations can be laid out within technical comparisons, though: many will argue that shooting analog is a more personal and enjoyable experience. That decision is completely up to you.

The inclusion of cameras in everything from cell phones to pens to children's toys is possible because of the low cost and low power consumption of the imaging arrays that form the core of the cameras. These arrays are low cost and low power because they are CMOS-based, which allows the devices to be made with the same processes and facilities used to make memory and computer chips. Yet the continued surge in CMOS imager popularity goes beyond the lower cost to other factors, such as the ability to integrate the sensors with electronics and the ability to achieve fast, customizable frame rates. People have been using cameras and film for more than 100 years, both for still photography and movies. There is something magical about the process: humans are visual creatures, and a picture really does paint a thousand words for us!
  • 51. 47
References:
1. Tom Harris, "How Cameras Work," http://electronics.howstuffworks.com/camera.htm
2. Reddit Photo Class, http://www.r-photoclass.com/
3. Image Acquisition, Springer.
4. http://electronics.howstuffworks.com/cameras-photography/digital/digital-camera.htm
5. Image Sensors, http://www.engineersgarage.com/articles/what-is-cmos-sensor
6. CCD vs. CMOS, http://electronics.howstuffworks.com/cameras-photography/digital/question362.htm/printable
7. Film vs. Digital, http://petapixel.com/2015/05/26/film-vs-digital-a-comparison-of-the-advantages-and-disadvantages/
8. E. R. Fossum, "CMOS Image Sensors: Electronic Camera on a Chip," IEDM, pp. 1.3.1-1.3.9, Dec. 1995.
9. B. Ackland and A. Dickinson, "Camera on a Chip," in ISSCC Dig., Feb. 1996, pp. 22-25.
10. M. A. Schuster and G. Strull, "A monolithic mosaic of photon sensors for solid state imaging applications," in International Electron Devices Meeting, 1965, pp. 20-21.
11. R. Melen, "The tradeoffs in monolithic image sensors: MOS vs CCD," Electronics, vol. 46, pp. 106-11, 1973.