COURSE MATERIAL
COURSE TITLE - FUNDAMENTALS OF REMOTE SENSING
COURSE CODE - GeES2081
Module Name and Code: SPATIAL DATA ACQUISITION AND ANALYSIS, GeESM2081
ADIGRAT UNIVERSITY
COLLEGE OF SOCIAL SCIENCE AND HUMANITIES
DEPARTMENT OF GEOGRAPHY AND ENVIRONMENTAL STUDIES
Edited By - Dr. Zubairul Islam, Associate Professor, Department of Geography and Environmental Studies.
zubairul@gmail.com
Compiled by : Dr. Zubairul Islam
COURSE CONTENT AND SCHEDULE OF ACTIVITIES
Week / Day and Contents
1st Week: CHAPTER ONE - Introduction
Day 1
1.1. Definition of remote sensing
1.2. Elements of the remote sensing process
1.3. Advantages of remote sensing
1.4. Applications of remote sensing
Day 2: CHAPTER TWO - Electromagnetic energy & remote sensing
2.1. Electromagnetic radiation
2.2. Electromagnetic spectrum
2.3. Energy interaction in the atmosphere
2.4. Energy interaction with the earth's surface
Days 3 & 4: CHAPTER THREE - Satellites and image characteristics
3.1. Satellite sensors and platforms
3.2. Satellite characteristics: orbits and swaths
3.3. Satellite image data characteristics
3.4. Image specifications of multispectral satellite sensors
2nd Week: CHAPTER FOUR - Digital Image Processing
Day 1
4.1 - Raster Bands
4.2 - Software Demonstration: ArcGIS 9.3
Day 2
4.3 - General Exercises with ArcGIS 9.3
Lab 4.3.1 - Add Raster Data
Lab 4.3.2 - Clip image using ArcGIS
Lab 4.3.3 - Creating a multiband dataset
Day 3
4.4 - Radiometric Corrections
Lab 4.4.1 - Conversion of DN values to reflectance
Day 4
4.5 - Band Combinations
Lab 4.5.1 - Band Combinations
4.6 - Image Classification & Land use mapping
Lab 4.6.1 - Image Classification
Day 5
4.7 - Land use Mapping
Lab 4.7.1 - Land use Mapping
3rd Week: CHAPTER FIVE - GPS and Remote Sensing
Day 1
5.1. Introduction
5.2. Satellite-based positioning
Day 2
5.3. Lab - General Functions of Garmin GPS – 60
Day 3
5.4. Lab - Create Waypoints
RECOMMENDED MODE OF ASSESSMENT
Continuous assessment (60%)
• Tests 1 & 2: 10 + 10 marks; 1st week, Friday afternoon and Saturday morning shift; Chapters 1, 2 & 3
• Labs 1 & 2: 10 + 10 marks; 2nd week, Friday afternoon and Saturday morning shift; Chapter 4
• Group project work: 20 marks; submission 3rd week Friday; Chapter 5
Final exam (40%)
20% from Chapters 1 to 3 and 20% from Chapters 4 & 5
Chapter One: Introduction
1.1. Definition of remote sensing
1.2 Elements of remote sensing process
1.3 Advantages of Remote Sensing
1.4 Applications of Remote Sensing
1.1. Definition of remote sensing
"Remote sensing is the science (and to some extent,
art) of acquiring information about the Earth's
surface without actually being in contact with it. This
is done by sensing and recording reflected or emitted
energy and processing, analyzing, and applying that
information."
1.2 Elements of remote sensing process
Seven elements comprise the remote sensing process from beginning to end.
1. Energy Source or Illumination (A) - the
first requirement for remote sensing is to have
an energy source which illuminates or
provides electromagnetic energy to the target
of interest.
2. Radiation and the Atmosphere (B) – as
the energy travels from its source to the
target, it will come in contact with and interact
with the atmosphere it passes through. This
interaction may take place a second time as
the energy travels from the target to the sensor.
3. Interaction with the Target (C) - once
the energy makes its way to the target
through the atmosphere, it interacts
with the target depending on the
properties of both the target and the
radiation.
4. Recording of Energy by the Sensor (D) -
after the energy has been scattered by, or
emitted from the target, we require a sensor
(remote - not in contact with the target) to
collect and record the electromagnetic
radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.
7. Application (G) - the final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.
1.3 Advantages of Remote Sensing
1. Enables observation of a broad area at one time
2. Enables observation of an area over a long period through repeat-pass observation (time-series data, change detection)
3. Enables assessment of an area's condition without visiting it
4. Enables capture of information invisible to the eye, using sensors for various parts of the electromagnetic spectrum (infrared, microwave)
1.4. Applications of Remote Sensing
Remote sensing has enabled mapping, studying, monitoring and
management of various resources like agriculture, forestry, geology, water,
ocean etc. It has further enabled monitoring of environment and thereby
helping in conservation. In the last four decades it has grown as a major
tool for collecting information on almost every aspect on the earth. With
the availability of very high spatial resolution satellites in the recent years,
the applications have multiplied.
Current major applications of remote sensing include the following:
1. Geology
2. Forestry
3. Change detection
4. Oceanography
5. Meteorology
6. Land degradation
7. Land use applications
Geology
In geology, for instance, remote sensing can be applied to analyze
and map large, remote areas. Remote sensing interpretation also
makes it easy for geologists in this case to identify an area's rock
types, geomorphology, and changes from natural events such as a
flood or landslide.
Forestry
Satellite imagery is used to identify and map: -
• The species of native and exotic forest trees.
• The effects of major diseases or adverse change in environmental conditions.
• The geographic extent of forests, etc.
Change detection
Satellite imagery is not always able to provide exact details about
the species or age of vegetation. However, the imagery provides a
very good means of measuring significant change in vegetation
cover, whether it is through clearing, wildfire damage or
environmental stress. The most common form of environmental
stress is water deficiency.
Oceanography
Remote sensing is applied to oceanography studies. Remote
sensing is used, for example, to measure sea surface temperature
and to monitor marine habitats.
Meteorology
Remote sensing is an effective method for mapping cloud type and
extent, and cloud top temperature.
Land degradation
Imagery can be used to map areas of poor or no vegetation cover. A
range of factors, including saline or sodic soils, and overgrazing, can
cause degraded landscapes.
Land use applications
Additionally, those studying urban and other land use applications are also concerned
with remote sensing because it allows them to easily pick out which land uses are
present in an area. This can then be used as data in city planning applications and the
study of species habitat, for example.
Chapter Two
Electromagnetic energy and remote sensing
2.1. Electromagnetic radiation
2.2. Electromagnetic spectrum
2.3 Energy interaction in the atmosphere
2.4 Energy interaction with the earth’s surface
2.1 Electromagnetic Radiation
The first requirement for remote sensing is to have an
energy source to illuminate the target (unless the
sensed energy is being emitted by the target). This
energy is in the form of electromagnetic radiation.
Two characteristics of electromagnetic radiation are
particularly important for understanding remote
sensing. These are the wavelength and frequency.
The wavelength is the length of one wave cycle, which can be
measured as the distance between successive wave crests.
Wavelength is usually represented by the Greek letter lambda
(λ). Wavelength is measured in metres (m) or some factor of
metres such as nanometres (nm, 10⁻⁹ metres), micrometres (μm, 10⁻⁶ metres) or centimetres (cm, 10⁻² metres). Frequency refers to the number of cycles of a wave
passing a fixed point per unit of time.
Wavelength and frequency are inversely related to each other: the shorter the wavelength, the higher the frequency.
2.2 - The Electromagnetic Spectrum
The electromagnetic (EM) spectrum is the continuous range of
electromagnetic radiation, extending from gamma rays (highest
frequency & shortest wavelength) to radio waves (lowest frequency
& longest wavelength) and including visible light.
The EM spectrum can be divided into seven different regions —
gamma rays, X-rays, ultraviolet, visible light, infrared, microwaves
and radio waves.
Remote sensing involves the measurement of energy in
many parts of the electromagnetic (EM) spectrum. The
major regions of interest in satellite sensing are visible
light, reflected and emitted infrared, and the microwave
regions. The measurement of this radiation takes place
in what are known as spectral bands. A spectral band is
defined as a discrete interval of the EM spectrum.
For example the wavelength range of 0.4 μm to 0.5 μm (μm = micrometres, or 10⁻⁶ m) is one spectral band.
Satellite sensors have been designed to measure
responses within particular spectral bands to enable the
discrimination of the major Earth surface materials.
Scientists will choose a particular spectral band for data
collection depending on what they wish to examine.
The design of satellite sensors is based on the absorption
characteristics of Earth surface materials across all the
measurable parts in the EM spectrum.
Visible Spectrum
The light which our eyes - our "remote sensors" - can
detect is part of the visible spectrum.
It is important to recognize how small the visible portion
is relative to the rest of the spectrum.
There is a lot of radiation around us which is invisible to
our eyes, but can be detected by other remote sensing
instruments and used to our advantage.
It is important to note that this is the only
portion of the EM spectrum we can associate
with the concept of colours.
Blue, green, and red are the primary colors or
wavelengths of the visible spectrum.
They are defined as such because no single primary color
can be created from the other two, but all other colors
can be formed by combining blue, green, and red in
various proportions.
Although we see sunlight as a uniform or homogeneous
color, it is actually composed of various wavelengths of
radiation in primarily the ultraviolet, visible and infrared
portions of the spectrum.
The visible portion of this radiation can be shown in its
component colors when sunlight is passed through a
prism.
Infrared (IR) Region
The IR Region covers the wavelength
range from approximately 0.7 μm to
100 μm - more than 100 times as
wide as the visible portion!
The IR region can be divided into
two categories based on their
radiation properties - the reflected
IR, and the emitted or thermal IR.
Reflected and Thermal IR
Radiation in the reflected IR region is used for remote
sensing purposes in ways very similar to radiation in the
visible portion. The reflected IR covers wavelengths from
approximately 0.7 μm to 3.0 μm.
The thermal IR region is quite different than the visible
and reflected IR portions, as this energy is essentially the
radiation that is emitted from the Earth's surface in the
form of heat. The thermal IR covers wavelengths from
approximately 3.0 μm to 100 μm.
Microwave Region
The portion of the spectrum of more
recent interest to remote sensing is
the microwave region from about 1
mm to 1 m.
This covers the longest wavelengths
used for remote sensing.
The shorter wavelengths have
properties similar to the thermal
infrared region while the longer
wavelengths approach the
wavelengths used for radio
broadcasts.
2.3 Interactions with the Atmosphere
Before radiation used for remote sensing reaches the
Earth's surface it has to travel through some distance of
the Earth's atmosphere. Particles and gases in the
atmosphere can affect the incoming light and radiation.
These effects are caused by the mechanisms of
Scattering and absorption.
Scattering
Scattering occurs when particles or large gas molecules present in the atmosphere
interact with and cause the electromagnetic radiation to be redirected from its original
path. How much scattering takes place depends on several factors including the wavelength
of the radiation, the abundance of particles or gases, and the distance the radiation travels
through the atmosphere. There are three (3) types of scattering which take place.
Rayleigh scattering occurs when particles are very small compared to the wavelength of
the radiation. These could be particles such as small specks of dust or nitrogen and oxygen
molecules. Rayleigh scattering causes shorter wavelengths of energy to be scattered much
more than longer wavelengths. Rayleigh scattering is the dominant scattering mechanism
in the upper atmosphere. The fact that the sky appears "blue" during the day is because of
this phenomenon. The red color at sunrise and sunset is also due to this phenomenon.
Mie scattering occurs when the particles are just about the same size as the wavelength
of the radiation. Dust, pollen, smoke and water vapour are common causes of Mie
scattering which tends to affect longer wavelengths than those affected by Rayleigh
scattering. Mie scattering occurs mostly in the lower portions of the atmosphere where
larger particles are more abundant, and dominates when cloud conditions are overcast.
The final scattering mechanism of importance is called
nonselective scattering. This occurs when the particles
are much larger than the wavelength of the radiation.
Water droplets and large dust particles can cause this type
of scattering.
Nonselective scattering gets its name from the fact that all
wavelengths are scattered about equally.
This type of scattering causes fog and clouds to appear
white to our eyes because blue, green, and red light are all
scattered in approximately equal quantities
(blue+green+red light = white light).
Absorption
Absorption is the other main mechanism at work
when electromagnetic radiation interacts with the atmosphere. In contrast to scattering,
this phenomenon causes molecules in the atmosphere to absorb energy at various
wavelengths. Ozone, carbon dioxide, and water vapour are the three main atmospheric
constituents which absorb radiation.
Ozone serves to absorb the harmful (to most living things) ultraviolet radiation from
the sun. Without this protective layer in the atmosphere our skin would burn when
exposed to sunlight.
Carbon dioxide is referred to as a greenhouse gas. This is because it tends to absorb
radiation strongly in the far infrared portion of the spectrum - that area associated with
thermal heating - which serves to trap this heat inside the atmosphere. Water vapour in
the atmosphere absorbs much of the incoming longwave infrared and shortwave
microwave radiation (between 22μm and 1m). The presence of water vapour in the
lower atmosphere varies greatly from location to location and at different times of the
year. For example, the air mass above a desert would have very little water vapour to
absorb energy, while the tropics would have high concentrations of water vapour (i.e.
high humidity).
Because these gases absorb electromagnetic energy in very specific
regions of the spectrum, they influence where (in the spectrum)
we can "look" for remote sensing purposes. Those areas of the
spectrum which are not severely influenced by atmospheric
absorption and thus, are useful to remote sensors, are called
atmospheric windows.
By comparing the characteristics of the two most common
energy/radiation sources (the sun and the earth) with the
atmospheric windows available to us, we can define those
wavelengths that we can use most effectively for remote sensing.
The visible portion of the spectrum, to which our eyes are most
sensitive, corresponds to both an atmospheric window and the
peak energy level of the sun. Note also that heat energy emitted by
the Earth corresponds to a window around 10 μm in the thermal IR
portion of the spectrum, while the large window at wavelengths
beyond 1 mm is associated with the microwave region.
2.4 Energy interaction with the earth’s surface
Radiation that is not absorbed or scattered in the atmosphere
can reach and interact with the Earth's surface. There are three
(3) forms of interaction that can take place when energy
strikes, or is incident (I) upon the surface.
These are: absorption (A); transmission (T); and reflection (R).
The total incident energy will interact with the surface in one
or more of these three ways. The proportions of each will
depend on the wavelength of the energy and the material and
condition of the feature.
Absorption (A) occurs when radiation (energy) is absorbed into
the target while transmission (T) occurs when radiation passes
through a target. Reflection (R) occurs when radiation
"bounces" off the target and is redirected. In remote sensing,
we are most interested in
measuring the radiation reflected from targets. We refer to two
types of reflection, which represent the two extreme ends of
the way in which energy is reflected from a target: specular
reflection and diffuse reflection.
Let's take a look at a couple of examples of targets at the Earth's surface and
how energy at the visible and infrared wavelengths interacts with them.
Leaves: A chemical compound in leaves called chlorophyll strongly absorbs radiation in
the red and blue wavelengths but reflects green wavelengths. Leaves appear "greenest"
to us in the summer, when chlorophyll content is at its maximum. In autumn, there is
less chlorophyll in the leaves, so there is less absorption and proportionately more
reflection of the red wavelengths, making the leaves appear red or yellow (yellow is a
combination of red and green wavelengths). The internal structure of healthy leaves acts as an excellent diffuse reflector of near-infrared wavelengths. If our eyes were sensitive to
near-infrared, trees would appear extremely bright to us at these wavelengths. In fact,
measuring and monitoring the near-IR reflectance is one way that scientists can
determine how healthy (or unhealthy) vegetation may be.
Water: Longer wavelength visible and near infrared radiation is absorbed more by water
than shorter visible wavelengths. Thus water typically looks blue or blue-green due to
stronger reflectance at these shorter wavelengths, and darker if viewed at red or near
infrared wavelengths. If there is suspended sediment present in the upper layers of the
water body, then this will allow better reflectivity and a brighter appearance of the water.
The apparent colour of the water will show a slight shift to longer wavelengths.
Suspended sediment (S) can be easily confused with shallow (but clear) water, since these
two phenomena appear very similar. Chlorophyll in algae absorbs more of the blue
wavelengths and reflects the green, making the water appear more green in colour when
algae is present. The topography of the water surface (rough, smooth, floating materials,
etc.) can also lead to complications for water-related interpretation due to potential
problems of specular reflection and other influences on colour and brightness.
We can see from these examples that, depending on the complex make-up of
the target that is being looked at, and the wavelengths of radiation involved, we
can observe very different responses to the mechanisms of absorption,
transmission, and reflection. By measuring the energy that is reflected (or
emitted) by targets on the Earth's surface over a variety of different
wavelengths, we can build up a spectral response for that object. By comparing
the response patterns of different features we may be able to distinguish
between them, where we might not be able to, if we only compared them at
one wavelength. For example, water and vegetation may reflect somewhat
similarly in the visible wavelengths but are almost always separable in the
infrared. Spectral response can be quite variable, even for the same target type,
and can also vary with time (e.g. "green-ness" of leaves) and location. Knowing
where to "look" spectrally and understanding the factors which influence the
spectral response of the features of interest are critical to correctly interpreting
the interaction of electromagnetic
radiation with the surface.
CHAPTER THREE: SATELLITES AND IMAGE CHARACTERISTICS
3.1. Satellite Sensors and platforms
3.1.1 - Airborne remote sensing
3.1.2. Space borne remote sensing
3.2 Satellite Characteristics: Orbits and Swaths
3.3. Satellite Image data characteristics
3.4. Image Specifications of Multispectral Satellite Sensors
A sensor is a device that measures and records electromagnetic
energy.
3.1. Satellite Sensors and platforms
Sensor
Sensors can be divided into two groups.
1. Passive sensors depend on an external source of energy,
usually the sun. Most of the satellite sensors are passive.
2. Active sensors have their own source of energy, an example
would be a radar gun. These sensors send out a signal and
measure the amount reflected back. Active sensors are more
controlled because they do not depend upon varying
illumination conditions
In order for a sensor to collect and record energy reflected or
emitted from a target or surface, it must reside on a stable
platform removed from the target or surface being observed. Platforms
for remote sensors may be situated on the ground, on an aircraft or
balloon (or some other platform within the Earth's atmosphere), or
on a spacecraft or satellite outside of the Earth's atmosphere.
Platforms
3.1.1 - Airborne remote sensing
Airborne remote sensing is the oldest and most widely used method
of remote sensing. Cameras mounted in light aircraft flying
between 200 and 15,000 m capture a large quantity of detailed
information. Aerial photos provide an instant visual inventory of a
portion of the earth's surface and can be used to create detailed
maps. Aerial photographs commonly are taken by commercial
aerial photography firms which own and operate specially modified
aircraft equipped with mapping quality cameras.
Camera and platform configurations can be grouped in terms of
oblique and vertical. Oblique aerial photography is taken at an
angle to the ground. The resulting images give a view as if the
observer is looking out an airplane window. These images are
easier to interpret than vertical photographs, but it is difficult to
locate and measure features on them for mapping purposes.
Vertical aerial photography is taken with the camera pointed straight
down. The resulting images depict ground features in plan form and
are easily compared with maps. Vertical aerial photos are always
highly desirable, but are particularly useful for resource surveys in
areas where no maps are available. Aerial photos depict features
such as field patterns and vegetation which are often omitted on
maps. Comparison of old and new aerial photos can also capture
changes within an area over time.
Vertical aerial photos contain subtle displacements due to relief, tip
and tilt of the aircraft and lens distortion. Vertical images may be
taken with overlap, typically about 60 percent along the flight line
and at least 20 percent between lines. Overlapping images can be
viewed with a stereoscope to create a three-dimensional view, called
a stereo model.
3.1.2. Space borne remote sensing
Photography has proven to be an important input to visual interpretation and the
production of analog maps. However, the development of satellite platforms, the
associated need to telemeter imagery in digital form, and the desire for highly
consistent digital imagery have given rise to the development of solid state
scanners as a major format for the capture of remotely sensed data.
The basic logic of a scanning sensor is the use of a mechanism to sweep a small
field of view (known as an instantaneous field of view—IFOV) in a west to east
direction at the same time the satellite is moving in a north to south direction.
Together this movement provides the means of composing a complete raster
image of the environment.
A simple scanning technique is to use a rotating mirror that can sweep the field
of view in a consistent west to east fashion. The field of view is then intercepted
with a prism that can spread the energy contained within the IFOV into its
spectral components. Photoelectric detectors (of the same nature as those found
in the exposure meters of commonly available photographic cameras) are then
arranged in the path of this spectrum to provide electrical measurements of the
amount of energy detected in various parts of the electromagnetic spectrum.
As the scan moves from west to east, these detectors are polled to
get a set of readings along the east-west scan. These form the
columns along one row of a set of raster images—one for each
detector. Movement of the satellite from north to south then
positions the system to detect the next row, ultimately leading to
the production of a set of raster images as a record of reflectance
over a range of spectral bands.
There are several satellite systems in operation today that collect
imagery that is subsequently distributed to users. Several of the
most common systems are described below. Each type of satellite
data offers specific characteristics that make it more or less
appropriate for a particular application.
3.2 Satellite Characteristics: Orbits and Swaths
Satellites have several unique characteristics which make them particularly useful
for remote sensing of the Earth's surface. The path followed by a satellite is
referred to as its orbit. Satellite orbits are matched to the capability and objective
of the sensor(s) they carry. Orbit selection can vary in terms of altitude (their height
above the Earth's surface) and their orientation and rotation relative to the Earth.
Geostationary orbits
Satellites at very high altitudes, which view the same portion of the Earth's surface
at all times have geostationary orbits. These geostationary satellites revolve at
speeds which match the rotation of the Earth so they seem stationary, relative to the
Earth's surface. This allows the satellites to observe and collect information
continuously over specific areas. Weather and communications satellites commonly
have these types of orbits. Due to their high altitude, some geostationary weather
satellites can monitor weather and cloud patterns covering an entire hemisphere of
the Earth.
Sun-synchronous
Many remote sensing platforms are designed to follow an orbit
(basically north-south) which, in conjunction with the Earth's
rotation (west-east), allows them to cover most of the Earth's
surface over a certain period of time. These are near-polar orbits,
so named for the inclination of the orbit relative to a line running
between the North and South poles. These satellite orbits are
known as sun-synchronous.
Swath
As a satellite revolves around the Earth, the sensor "sees" a certain portion of the
Earth's surface. The area imaged on the surface, is referred to as the swath.
Imaging swaths for space borne sensors generally vary between tens and
hundreds of kilometers wide. As the satellite orbits the Earth from pole to pole,
its east-west position wouldn't change if the Earth didn't rotate.
However, as seen from the Earth, it seems that the satellite is shifting westward
because the Earth is rotating (from west to east) beneath it. This apparent
movement allows the satellite swath to cover a new area with each consecutive
pass. The satellite's orbit and the rotation of the Earth work together to allow
complete coverage of the Earth's surface, after it has completed one complete
cycle of orbits.
3.3. Satellite Image data characteristics
Image data characteristics can be explained under four categories as follows:
1. The spatial resolution,
2. Spectral resolution
3. Radiometric resolution
4. Temporal resolution
3.3.1- Spatial Resolution, Pixel size and scale
The detail discernible in an image is dependent on the spatial
resolution of the sensor and refers to the size of the smallest
possible feature that can be detected. Spatial resolution of
passive sensors (we will look at the special case of active
microwave sensors later) depends primarily on their
Instantaneous Field of View (IFOV).
The IFOV is the angular cone of visibility of the
sensor (A) and determines the area on the Earth's
surface which is "seen" from a given altitude at
one particular moment in time (B). The size of
the area viewed is determined by multiplying the
IFOV by the distance from the ground to the
sensor (C). This area on the ground is called the
resolution cell and determines a sensor's
maximum spatial resolution.
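A minimal Python sketch of the resolution-cell arithmetic just described (the IFOV and altitude values below are illustrative, not taken from any specific sensor):

```python
# Ground resolution cell = IFOV (in radians) x distance from the ground to the sensor.
ifov_mrad = 0.043      # illustrative IFOV in milliradians
altitude_m = 705_000   # illustrative orbital altitude (705 km)

cell_size_m = (ifov_mrad / 1000.0) * altitude_m
print(f"resolution cell is about {cell_size_m:.1f} m on a side")  # about 30 m
```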
For a homogeneous feature to be detected, its size generally has to be equal to or
larger than the resolution cell. If the feature is smaller than this, it may not be
detectable as the average brightness of all features in that resolution cell will be
recorded. However, smaller features may sometimes be detectable if their
reflectance dominates within a particular resolution cell allowing sub-pixel or
resolution cell detection.
PIXEL
Most remote sensing images are composed of a matrix of picture elements, or
pixels, which are the smallest units of an image. Image pixels are normally square
and represent a certain area on an image. It is important to distinguish between
pixel size and spatial resolution - they are not interchangeable. If a sensor has a
spatial resolution of 20 metres and an image from that sensor is displayed at full
resolution, each pixel represents an area of 20m x 20m on the ground. In this case
the pixel size and resolution are the same. However, it is possible to display an
image with a pixel size different than the resolution.
Images where only large features are visible are said to have
coarse or low resolution. In fine or high resolution images,
small objects can be detected. Military sensors for example, are
designed to view as much detail as possible, and therefore have
very fine resolution. Commercial satellites provide imagery with
resolutions varying from a few metres to several kilometers.
Generally speaking, the finer the resolution, the less total ground
area can be seen.
SCALE
The ratio of distance on an image or map, to actual ground
distance is referred to as scale. If you had a map with a scale of
1:100,000, an object of 1cm length on the map would actually be
an object 100,000cm (1km) long on the ground. Maps or images
with small "map-to-ground ratios" are referred to as small scale
(e.g. 1:100,000), and those with larger ratios (e.g. 1:5,000) are
called large scale.
3.3.2- Spectral Resolution
Different classes of features and details in an image can often be
distinguished by comparing their responses over distinct
wavelength ranges. Broad classes, such as water and vegetation,
can usually be separated using very broad wavelength ranges - the visible and near infrared. Other more specific classes, such as
different rock types, may not be easily distinguishable using either
of these broad wavelength ranges and would require comparison at
much finer wavelength ranges to separate them. Thus, we would
require a sensor with higher spectral resolution. Spectral
resolution describes the ability of a sensor to define fine
wavelength intervals. The finer the spectral resolution, the narrower
the wavelength ranges for a particular channel or band.
Black and white film records wavelengths extending over much,
or all of the visible portion of the electromagnetic spectrum. Its
spectral resolution is fairly coarse, as the various wavelengths of
the visible spectrum are not individually distinguished and the
overall reflectance in the entire visible portion is recorded.
Color film is also sensitive to the reflected energy over the visible
portion of the spectrum, but has higher spectral resolution, as it is
individually sensitive to the reflected energy at the blue, green,
and red wavelengths of the spectrum. Thus, it can represent
features of various colors based on their reflectance in each of
these distinct wavelength ranges.
Many remote sensing systems record energy over several separate
wavelength ranges at various spectral resolutions. These are
referred to as multi-spectral sensors and will be described in
some detail in following sections. Advanced multi-spectral
sensors called hyperspectral sensors, detect hundreds of very
narrow spectral bands throughout the visible, near-infrared, and
mid-infrared portions of the electromagnetic spectrum. Their very
high spectral resolution facilitates fine discrimination between
different targets based on their spectral response in each of the
narrow bands.
3.3.3- Radiometric Resolution
The radiometric characteristics describe the actual information
content in an image. Every time an image is acquired on film or
by a sensor, its sensitivity to the magnitude of the electromagnetic
energy determines the radiometric resolution. The radiometric
resolution of an imaging system describes its ability to
discriminate very slight differences in energy. The finer the
radiometric resolution of a sensor, the more sensitive it is to
detecting small differences in reflected or emitted energy.
Imagery data are represented by positive digital numbers which vary from 0 to
(one less than) a selected power of 2. This range corresponds to the number of
bits used for coding numbers in binary format. Each bit records an exponent of power 2 (e.g. 1 bit = 2¹ = 2). The maximum number of brightness levels available depends on the number of bits used in representing the energy recorded.
Thus, if a sensor used 8 bits to record the data, there would be 2⁸ = 256 digital values available, ranging from 0 to 255.
However, if only 4 bits were used, then only 2⁴ = 16 values ranging from 0 to 15
would be available. Thus, the radiometric resolution would be much less.
Image data are generally displayed in a range of grey tones, with black
representing a digital number of 0 and white representing the maximum value
(for example, 255 in 8-bit data). By comparing a 2-bit image with an 8-bit
image, we can see that there is a large difference in the level of detail discernible
depending on their radiometric resolutions.
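A minimal sketch of the bit-depth arithmetic above (nothing assumed beyond raising 2 to the number of bits):

```python
# Number of digital values available for a given radiometric resolution.
for bits in (2, 4, 8, 16):
    levels = 2 ** bits
    print(f"{bits}-bit data: {levels} values (0 to {levels - 1})")
```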
3.3.4- Temporal Resolution
In addition to spatial, spectral, and radiometric resolution, the concept of
temporal resolution is also important to consider in a remote sensing
system.
The revisit period of a satellite sensor is usually several days. Therefore
the absolute temporal resolution of a remote sensing system to image the
exact same area at the same viewing angle a second time is equal to this
period.
The time factor in imaging is important when:
• Persistent clouds offer limited clear views of the Earth's surface (often
in the tropics)
• Short-lived phenomena (floods, oil slicks, etc.) need to be imaged
• Multi-temporal comparisons are required (e.g. the spread of a forest
disease from one year to the next)
• The changing appearance of a feature over time can be used to
distinguish it from near-similar features (wheat / maize)
3.4. Image Specifications of Multispectral Satellite Sensors
Landsat
The Landsat program is the longest running enterprise for acquisition of satellite
imagery of Earth. On July 23, 1972 the Earth Resources Technology Satellite was
launched. This was eventually renamed to Landsat. The most recent, Landsat 8,
was launched on February 11, 2013.
Landsat sensors collected data over a swath width of 185 km, with a full scene
being defined as 185 km x 185 km.
The instruments on the Landsat satellites have acquired millions of images. The
images, archived in the United States and at Landsat receiving stations around the
world, are a unique resource for global change research and applications in
agriculture, cartography, geology, forestry, regional planning, surveillance and
education, and can be viewed through the USGS 'Earth Explorer' website.
Landsat: Spectral Bands Characteristics
Band 1 - Blue light: scattered by the atmosphere and illuminates material in shadows better than longer wavelengths; penetrates clear water better than other colors; absorbed by chlorophyll, so plants don't show up very brightly in this band; useful for soil/vegetation discrimination, forest type mapping, and identifying man-made features.
Band 2 - Green light: penetrates clear water fairly well and gives excellent contrast between clear and turbid (muddy) water; helps find oil on the surface of water, and vegetation (plant life); reflects more green light than any other visible color; man-made features are still visible.
Band 3 - Red light: limited water penetration; reflects well from dead foliage, but not well from live foliage with chlorophyll; useful for identifying vegetation types, soils, and urban (city and town) features.
Band 4 - Near IR (NIR): good for mapping shorelines and biomass content; very good at detecting and analyzing vegetation.
Band 5 - Shortwave IR (SWIR): limited cloud penetration; provides good contrast between different types of vegetation; useful for measuring the moisture content of soil and vegetation; helps differentiate between snow and clouds.
Band 6 - Thermal IR (TIR or LWIR): useful to observe temperature and its effects, such as daily and seasonal variations; useful to identify some vegetation density, moisture, and cover type; ETM+ TIR has 60-meter pixels; TIR pixels on Landsat-5 are 120 meters.
Band 7 - Another SWIR: limited cloud penetration; provides good contrast between different types of vegetation; useful for measuring the moisture content of soil and vegetation; helps differentiate between snow and clouds.
Band 8 - Panchromatic ("pan"): on Landsat 7 only, has 15 m resolution, used to "sharpen" images.
Landsat 5 TM: Specifications
Data product: Landsat 5 TM
Provider: NASA/USGS
Orbit: 705 km, sun-synchronous
Spatial resolution: 30 m (120 m for the thermal band)
Swath width: 185 km
Pass-over time: every 16 days; equator at ~09h45 (local time)
Date range of acquisition: since March 1, 1984 (note: first Landsat mission in 1972)
Spectral coverage:
Band 1: 0.45-0.52 µm (30 m)
Band 2: 0.52-0.60 µm (30 m)
Band 3: 0.63-0.69 µm (30 m)
Band 4: 0.76-0.90 µm (30 m)
Band 5: 1.55-1.75 µm (30 m)
Band 6: 10.4-12.5 µm (120 m) (IR)
Band 7: 2.08-2.35 µm (30 m)
Data use: oceanography, aerosols, bathymetry, vegetation types, peak vegetation, biomass content analysis, moisture analysis, thermal mapping, mineral deposit identification
Landsat 7 ETM+: Specifications
Data product: Landsat 7 ETM+
Provider: NASA/USGS
Orbit: 705 km, sun-synchronous
Spatial resolution: 15 m (panchromatic), 30 m (multispectral), 60 m (thermal)
Swath width: 183 km
Pass-over time: every 16 days; equator at ~10h00 (local time)
Date range of acquisition: since April 15, 1999
Spectral coverage:
B1: 0.45-0.515 µm (30 m)
B2: 0.525-0.605 µm (30 m)
B3: 0.63-0.69 µm (30 m)
B4: 0.75-0.90 µm (30 m)
B5: 1.55-1.75 µm (30 m)
B6: 10.4-12.5 µm (60 m)
B7: 2.09-2.35 µm (30 m)
B8: 0.52-0.9 µm (15 m)
Data use: oceanography, aerosols, bathymetry, vegetation types, peak vegetation, biomass content analysis, moisture analysis, thermal mapping, mineral deposit identification
PLS NOTE: B1: Band 1, B2: Band 2, B3: Band 3, B4: Band 4, B5: Band 5, B6: Band 6, B7: Band 7, B8: Band 8
Landsat 8: Specifications
Data product: Landsat 8
Provider: NASA/USGS
Orbit: 705 km, sun-synchronous
Sensors: OLI (B1-B9) and TIRS (B10-B11)
Spatial resolution: 15 m (panchromatic), 30 m (OLI multispectral), 100 m (TIRS thermal)
Swath width: 185 km
Pass-over time: every 16 days; equator at ~10h00 (local time)
Date range of acquisition: since 11-Feb-13
Spectral coverage:
B1: 0.433–0.453 µm (30 m)
B2: 0.450–0.515 µm (30 m)
B3: 0.525–0.600 µm (30 m)
B4: 0.630–0.680 µm (30 m)
B5: 0.845–0.885 µm (30 m)
B6: 1.560–1.660 µm (30 m)
B7: 2.100–2.300 µm (30 m)
B8: 0.500–0.680 µm (15 m)
B9: 1.360–1.390 µm (30 m)
B10: 10.6-11.2 µm (100 m) (IR)
B11: 11.5-12.5 µm (100 m) (IR)
Data use: oceanography, aerosols, bathymetry, vegetation types, peak vegetation, biomass content analysis, moisture analysis, cloud cover analysis, thermal mapping, soil moisture estimation
PLS NOTE: B1: Band 1, B2: Band 2, B3: Band 3, B4: Band 4, B5: Band 5, B6: Band 6, B7: Band 7, B8: Band 8
QuickBird: Specifications
Data product: QuickBird
Provider: DigitalGlobe
Orbit: sun-synchronous; 482 km / 450 km
Spatial resolution: 65 cm panchromatic (B/W) and 2.62 m multispectral (RGB + NIR) at 482 km; 61 cm panchromatic and 2.44 m multispectral at 450 km
Swath width: 16.8 km at 482 km; 18 km at 450 km
Pass-over time: every 2.4-5.9 days; equator at 10h30 (local time)
Date range of acquisition: since October 18, 2001
Spectral coverage:
B/W (panchromatic): 405-1053 nm
Blue: 430-545 nm
Green: 466-620 nm
Red: 590-710 nm
NIR: 715-918 nm
Data use: mapping, change detection, planning (engineering, natural resources, urban, infrastructure), land-use, EIA, tourism, military, crop management, environmental monitoring
Ikonos: Specifications
Data product: Ikonos
Provider: DigitalGlobe
Orbit: 681 km, sun-synchronous
Spatial resolution: 80 cm panchromatic (B/W); 3.2 m multispectral (RGB + NIR)
Swath width: 11.3 km
Pass-over time: every 3 days
Date range of acquisition: launched September 24, 1999
Spectral coverage:
B/W (panchromatic): 445-900 nm
Blue: 445-516 nm
Green: 506-595 nm
Red: 632-698 nm
NIR: 757-853 nm
Data use: mapping, change detection, planning (engineering, natural resources, urban, infrastructure), land-use, EIA, tourism, military, crop management, environmental monitoring
Chapter – 4 - Digital image processing
4.1 - Raster Bands
4.2 - Software Demonstration: ArcGIS 9.3
4.3 – General Exercises
Lab 4.3.1 – Add Raster Data into ArcMap
Lab 4.3.2 - Clip image using ArcGIS and Spatial Analyst
4.4 – Radiometric Corrections
Lab 4.4.1 - Conversion of DN Values to reflectance
4.5 – Band Combinations
Lab 4.5.1 - Band Combinations with Arc Map
4.6 – Image Classification
Lab 4.6.1 - Image Classification
4.7 – Land Use map
Lab 4.7.1 - Create land use map
4.1 - Raster bands
Rasters may be single band or multiple bands. An example of a single-band raster
dataset is a digital elevation model (DEM). Each cell in a DEM contains only one
value representing surface elevation. Most satellite imagery has multiple bands,
typically containing values within a range or band of the electromagnetic
spectrum.
There are three main ways to display single-band raster datasets:
1. Using two colors—In a binary image, each cell has a value of 0 or 1 and is
often displayed using black and white. This type of display is often used for
displaying scanned maps with simple line work, such as parcel maps.
2. Grayscale—In a grayscale image, each cell has a value from 0 to another
number, such as 255 or 65535. These are often used for black-and-white
aerial photographs.
3. Color map—One way to represent colors on an image is with a color map. A
set of values is coded to match a defined set of red, green, and blue (RGB)
values.
When there are multiple bands, every cell
location has more than one value associated
with it. With multiple bands, each band usually
represents a segment of the electromagnetic
spectrum collected by a sensor. Bands can
represent any portion of the electromagnetic
spectrum, including ranges not visible to the
eye, such as the infrared or ultraviolet sections.
The term band originated from the reference to
the color band on the electromagnetic
spectrum.
When you create a map layer from a raster
image, you can choose to display a single band
of data or form a color composite from multiple
bands. A combination of any three of the
available bands in a multiband raster dataset
can be used to create RGB composites. By
displaying bands together as RGB composites,
often more information is gleaned from the
dataset than if you were to work with just one
band.
A satellite image, for example, commonly has multiple bands representing different
wavelengths from the ultraviolet through the visible and infrared portions of the
electromagnetic spectrum. Landsat imagery, for example, is data collected from seven
different bands of the electromagnetic spectrum. Bands 1–7, excluding band 6, represent data from the visible, near infrared, and mid-infrared regions. Band 6 collects data from the
thermal infrared region. Another example of a multiband image is a true color orthophoto
in which there are three bands, each representing either red, green, or blue light.
PIXEL VALUES
A raster consists of a matrix of cells (or pixels) organized into rows and columns (or a grid)
where each cell contains a value representing information.
Rasters are digital aerial photographs, imagery from satellites, digital pictures, or even
scanned maps.
USES OF RASTER DATA
Because the structure of raster data is simple, it is exceptionally useful for a wide range of applications. Within a GIS, the uses of raster data fall under four main categories:
Rasters as basemaps
A common use of raster data in a GIS is as a
background display for other feature layers.
Below is a raster used as a basemap for road
data.
Rasters as surface maps
Rasters are well suited for representing data that changes continuously across a
landscape (surface). Elevation values measured from the earth's surface are the most
common application of surface maps, but other values, such as rainfall, temperature,
concentration, and population density, can also define surfaces that can be spatially
analyzed.
The raster below displays elevation—using green to show lower elevation and red, pink,
and white cells to show higher elevation.
Rasters as thematic maps
Rasters representing thematic data can be derived from
analyzing other data. A common analysis application is
classifying a satellite image by land-cover categories.
Basically, this activity groups the values of multispectral
data into classes (such as vegetation type) and assigns a
categorical value.
Below is an example of a classified raster dataset showing
land use.
General characteristics of raster data
In raster datasets, each cell (which is also known as a pixel) has a value. The cell values represent the phenomenon portrayed by the raster dataset, such as a category, magnitude, height, or spectral value. The category could be a land-use class such
pollution, or percent rainfall. Height could represent surface elevation above
mean sea level, which can be used to derive slope, aspect, and watershed
properties. Spectral values are used in satellite imagery and aerial photography to
represent light reflectance and color.
Cell values can be either
positive or negative, integer,
or floating point. Integer
values are best used to
represent categorical
(discrete) data, and floating-
point values to represent
continuous surfaces. Cells can
also have a NoData value to
represent the absence of data.
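A small sketch of these value types using NumPy (the class codes, elevations, and the -9999 NoData value are made up for illustration):

```python
import numpy as np

# Integer raster for categorical (discrete) data, float raster for a continuous
# surface, and a NoData value handled with a masked array.
landuse = np.array([[1, 1, 2],
                    [2, 3, 3]], dtype=np.int16)                   # discrete class codes
elevation = np.array([[812.4, 815.1, -9999.0],
                      [820.7, 818.3, 822.9]], dtype=np.float32)   # metres; -9999 marks NoData

elev = np.ma.masked_equal(elevation, -9999.0)
print(elev.mean())   # statistics ignore the NoData cell
```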
The area (or surface) represented by
each cell consists of the same width and
height and is an equal portion of the
entire surface represented by the raster.
For example, a raster representing
elevation (that is, digital elevation
model) may cover an area of 100 square
kilometers. If there were 100 cells in
this raster, each cell would represent
one square kilometer of equal width
and height (that is, 1 km x 1 km).
The dimension of the cells can be as large or as small as needed to
represent the surface conveyed by the raster dataset and the
features within the surface, such as a square kilometer, square foot,
or even a square centimeter. The cell size determines how coarse or
fine the patterns or features in the raster will appear.
The smaller the cell size, the smoother or more
detailed the raster will be. However, the greater
the number of cells, the longer it will take to
process, and it will increase the demand for
storage space. If a cell size is too large,
information may be lost or subtle patterns may
be obscured. For example, if the cell size is
larger than the width of a road, the road may
not exist within the raster dataset. In the
diagram below, you can see how this simple
polygon feature will be represented by a raster
dataset at various cell sizes.
4.2 - SOFTWARE DEMONSTRATION
An overview of ArcMap
ArcMap is where you display and explore the datasets for your study area, where you
assign symbols, and where you create map layouts for printing or publication. ArcMap
is also the application you use to create and edit datasets.
ArcMap represents geographic information as a collection of layers and other elements
in a map. Common map elements include the data frame containing map layers for a
given extent plus a scale bar, north arrow, title, descriptive text, a symbol legend, and so
on.
ArcMap documents
When you save a map you have created in ArcMap, it will be saved as a file on disk. A
filename extension (.mxd) will be automatically appended to your map document
name. You can work with an existing .mxd by double-clicking the document to open it.
This will start an ArcMap session for that .mxd.
Map documents contain display properties of the geographic information you work
with in the map—such as the properties and definitions of your map layers, data
frames, and the map layout for printing—plus any optional customizations and macros
that you add to your map.
Views in ArcMap
ArcMap displays map contents in one of two views:
1. Data view
2. Layout view
Each view lets you look at and interact with the map in a specific way.
In data view, the active data frame is presented as a geographic window in which map
layers are displayed and used.
Layout view is used to design and author a map for printing, exporting, or publishing.
You can manage map elements within the page space (typically, inches or centimeters),
add new map elements, and preview what your map will look like before exporting or
printing it. Common map elements include: data frames with map layers, scale bars,
north arrows, symbol legends, map titles, text, and other graphical elements.
4.3 – GENERAL EXERCISES
LAB 4.3.1 – Add Raster Data into ArcMap
1. Open ArcMap- it is one of several programs within the package titled ArcGIS. Once
the program is open it will prompt you with this window:
2. Make sure “A new empty map” is selected and click “Ok.”
3. To add data to your map, click on the “Add Data” icon. Navigate to the folder and
one of the Landsat scenes you are interested in mosaicking and click “Add.”
4. When ArcMap prompts you with this window:
Click “Yes” and wait for the few moments it takes to display the data (A progress
bar usually appears in the lower right-hand corner of the screen)
5. Repeat steps 3 and 4 for the other scenes you wish to add.
LAB 4.3.2 - Clip image using ArcGIS
STEPS
1. Add data – Raster + Vector
2. Data Management Tools
3. Raster
4. Raster Processing
5. Clip
6. Input Raster
7. Output extent
8. Use input features for clipping geometry
9. Output raster dataset
10. NoData value - 0
11. OK
12. Result
Note – Repeat the same exercise for bands 2, 3, 4 and 5.
4.4 - Radiometric correction
Radiometric correction is carried out to remove radiometric errors or distortions.
When reflected electro-magnetic energy is observed by a
sensor, the observed energy does not match with the
energy reflected from the same object observed from a
short distance. This is due to the sun's azimuth and
elevation, atmospheric conditions such as fog or aerosols,
sensor's response etc. which influence the observed
energy. Therefore, in order to obtain the real reflectance,
those radiometric distortions must be corrected.
Radiometric correction involves converting DN values to radiance or reflectance, where:
Radiance is the intensity of radiation leaving the ground in a given direction; its unit of measurement is W/m²/sr.
Reflectance is the ratio between radiance and incoming irradiance.
Radiometric correction of Landsat 8 Data
The standard Landsat 8 products provided by the USGS EROS
Center consist of quantized and calibrated scaled Digital Numbers
(DN) representing multispectral image data acquired by both the
Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS).
Background
The products are delivered in 16-bit unsigned integer format and can
be rescaled to the Top Of Atmosphere (TOA) reflectance and/or
radiance using radiometric rescaling coefficients provided in the
product metadata file (MTL file), as briefly described below.
Conversion to TOA Radiance
OLI and TIRS band data can be converted to TOA spectral radiance
using the radiance rescaling factors provided in the metadata file:
Lλ = ML × Qcal + AL ……. Eq. 01
where:
Lλ = TOA spectral radiance (Watts/(m² · sr · μm))
ML = Band-specific multiplicative rescaling factor from the metadata
(RADIANCE_MULT_BAND_x, where x is the band number)
AL = Band-specific additive rescaling factor from the metadata
(RADIANCE_ADD_BAND_x, where x is the band number)
Qcal = Quantized and calibrated standard product pixel values (DN)
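A minimal Python sketch of Eq. 01 applied to an array of DN values (NumPy assumed; the rescaling factors shown are placeholders, the real ones come from the scene's MTL metadata file):

```python
import numpy as np

# Eq. 01: L_lambda = ML * Qcal + AL
ML = 0.012    # RADIANCE_MULT_BAND_x from the MTL file (placeholder value)
AL = -60.0    # RADIANCE_ADD_BAND_x from the MTL file (placeholder value)

Qcal = np.array([[5021, 7398],
                 [9120, 11504]], dtype=np.float64)  # quantized DN values
L = ML * Qcal + AL   # TOA spectral radiance, W/(m^2 * sr * um)
print(L)
```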
Conversion to TOA Reflectance
OLI band data can also be converted to TOA planetary reflectance
using reflectance rescaling coefficients provided in the product
metadata file (MTL file). The following equation is used to convert
DN values to TOA reflectance for OLI data as follows:
ρλ' = Mρ × Qcal + Aρ ……. Eq. 02
Where:
ρλ' = TOA planetary reflectance, without correction for solar angle.
Mρ = Band-specific multiplicative rescaling factor from the metadata
(REFLECTANCE_MULT_BAND_x, where x is the band number)
Aρ = Band-specific additive rescaling factor from the metadata
(REFLECTANCE_ADD_BAND_x, where x is the band number)
Qcal = Quantized and calibrated standard product pixel values (DN)
TOA reflectance with a correction for the sun angle is then:
ρλ = ρλ' / sin(θSE) ……. Eq. 03
where:
ρλ = TOA planetary reflectance
θSE = Local sun elevation angle. The scene center sun elevation angle in degrees is provided in the metadata (SUN_ELEVATION).
So the reflectance formula may be written as:
ρλ = (Mρ × Qcal + Aρ) / sin(θSE)
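A matching sketch of Eq. 02 and Eq. 03 (again with placeholder metadata values; in Lab 4.4.1 the same arithmetic is typed into the ArcGIS Raster Calculator):

```python
import numpy as np

# Eq. 02-03: rho_lambda = (Mrho * Qcal + Arho) / sin(sun elevation)
Mrho = 2.0e-5         # REFLECTANCE_MULT_BAND_x (placeholder)
Arho = -0.1           # REFLECTANCE_ADD_BAND_x (placeholder)
sun_elev_deg = 55.3   # SUN_ELEVATION from the MTL file (placeholder)

Qcal = np.array([[5021, 7398],
                 [9120, 11504]], dtype=np.float64)
rho = (Mrho * Qcal + Arho) / np.sin(np.radians(sun_elev_deg))
print(rho)   # TOA reflectance, roughly between 0 and 1
```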
LAB 4.4.1 - Landsat 8 data - conversion of DN values to reflectance with ArcGIS
1. Add Data
2. Open Spatial Analyst
3. Raster Calculator
4. Enter the formula
5. Evaluate
6. Right click on the calculation
7. Data
8. Make Permanent
Save to your folder
RESULT
Note : Reflectance values range from 0 to 1 and vary from one
material to another.
Repeat the same exercise for bands 2, 3, 4 and 5.
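The Raster Calculator expression in step 4 can also be written as Spatial Analyst map algebra in Python. This is a sketch only, assuming arcpy with the Spatial Analyst extension; the coefficient values and file names are illustrative and should be replaced with the values from your scene's MTL file.

```python
# Sketch of the lab's DN-to-reflectance calculation as arcpy map algebra.
import math
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")   # Spatial Analyst licence

M_rho = 2.0e-5        # REFLECTANCE_MULT_BAND_4 (illustrative)
A_rho = -0.1          # REFLECTANCE_ADD_BAND_4 (illustrative)
sun_elev = 55.0       # SUN_ELEVATION in degrees (illustrative)

band4 = Raster("band4_clip.tif")                       # clipped DN raster (hypothetical name)
reflectance = (M_rho * band4 + A_rho) / math.sin(math.radians(sun_elev))
reflectance.save("band4_reflectance.tif")              # equivalent of "Make Permanent"
```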
91Compiled by : Dr. Zubairul Islam
4.5 - Band Combinations
Landsat 8 measures different ranges of frequencies
along the electromagnetic spectrum – a color, although
not necessarily a color visible to the human eye. Each
range is called a band, and Landsat 8 has 11 bands.
Landsat 8 numbers its red, green and blue bands as 4,
3 and 2, so when we combine them we get a true-color
image.
92Compiled by : Dr. Zubairul Islam
STEPS FOR CREATING A TRUE-COLOR COMBINATION OF IMAGES
1. Add Data
2. Data Management Tools
3. Raster
4. Raster Processing
5. Composite Bands
6. Input Rasters
7. Output Raster (location)
8. OK
RESULT -
Note : In this image green represents vegetation, brown represents
barren and rocky land, and blue represents water.
LAB 4.5 – Band Combinations of Images
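The same composite can be produced with a single geoprocessing call. A minimal sketch, assuming arcpy is available and using hypothetical band file names:

```python
# Sketch: stack the red, green and blue Landsat 8 bands (4, 3, 2) into one
# multiband raster for a true-color display. File names are hypothetical.
import arcpy

arcpy.CompositeBands_management(["band4.tif", "band3.tif", "band2.tif"],
                                "landsat8_true_color.tif")
```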
93Compiled by : Dr. Zubairul Islam
4.6 – IMAGE CLASSIFICATION
The goal of classification is to assign each cell in the
study area to a known class (supervised classification)
or to a cluster (unsupervised classification).
In both cases, the input to classification is a signature
file containing the multivariate statistics of each class
or cluster.
The result of classification is a map that partitions the
study area into classes. Classifying locations into
naturally occurring classes corresponding to clusters is
also referred to as stratification.
94Compiled by : Dr. Zubairul Islam
Unsupervised classifications
In an unsupervised classification, you divide the study area into a
specified number of statistical clusters. These clusters are then
interpreted into meaningful classes.
Unsupervised classification is done in two stages as follows:
A. Iso Cluster (creates a signature file)
B. Maximum Likelihood Classification
A. Steps for Iso Cluster
1. Add data – Composite image
2. Spatial Analyst Tools
3. Multivariate
4. Iso Cluster
5. Input raster bands
6. Output signature file (check the output location)
7. Number of classes
8. OK
LAB 4.6.1 - Unsupervised classifications
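A scripted version of the Iso Cluster step, as a sketch under the assumption that arcpy and the Spatial Analyst extension are available; the composite image and signature-file names are hypothetical.

```python
# Sketch: run Iso Cluster on the composite image to create a signature file.
import arcpy
from arcpy.sa import IsoCluster

arcpy.CheckOutExtension("Spatial")

# 5 statistical clusters; the .gsg signature file feeds the next step.
IsoCluster("landsat8_true_color.tif", "clusters.gsg", 5)
```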
95Compiled by : Dr. Zubairul Islam
B. Steps for Maximum Likelihood Classification
1. Spatial Analyst Tools
2. Multivariate
3. Maximum Likelihood Classification
4. Input raster bands
5. Input signature file
6. Output classified raster
7. OK
RESULT
Note : The image has been divided into 3 clusters.
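The Maximum Likelihood step can be scripted the same way. A minimal sketch assuming arcpy Spatial Analyst, using the signature file created in the previous step (file names hypothetical):

```python
# Sketch: maximum likelihood classification using the Iso Cluster signature file.
import arcpy
from arcpy.sa import MLClassify

arcpy.CheckOutExtension("Spatial")

classified = MLClassify("landsat8_true_color.tif", "clusters.gsg")
classified.save("unsupervised_classes.tif")   # classified raster, one value per cluster
```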
96Compiled by : Dr. Zubairul Islam
4.7 – Land use Map
A land use map shows the types and intensities of different land uses in a particular
area. For the land use map, the image classified in the last exercise may be used as
follows. This exercise comprises 2 steps.
LAB 4.7.1 - Create a land use map with the classified raster image created in the last exercise
First step
1. Right click on the classified image
2. Properties
3. Symbology
4. Double click on a colored box
5. Change the land use type
6. Apply
7. OK
Cont. to second step
97Compiled by : Dr. Zubairul Islam
Second step
1. Click Layout View
2. Use the layout tools to place the map correctly
3. Give a title
4. Legend (set items)
5. Scale bar (set properties)
6. File – Export – Save
RESULT
Note – This map may be used for your work.
98Compiled by : Dr. Zubairul Islam
END OF CHAPTER 4
99Compiled by : Dr. Zubairul Islam
CHAPTER FIVE
GPS and remote sensing
5.1. Introduction
5.2. Satellite based positioning
5.3. General Functions of Garmin GPS – 60
5.4. Exercise – Create Waypoints
100Compiled by : Dr. Zubairul Islam
5.1. Introduction
• GPS and remote sensing imagery are two primary, and very
important, GIS data sources.
• GPS data provides points (positions), polylines, or polygons
for GIS.
• Remote sensing imagery is used as a major base map in
GIS.
101Compiled by : Dr. Zubairul Islam
5.2. Satellite based positioning
• GPS is a Satellite Navigation System.
• GPS is funded and controlled by the U. S. Department of
Defense (DOD). While there are many thousands of civil users
of GPS world-wide, the system was designed for and is
operated by the U. S. military.
• GPS provides specially coded satellite signals that can be
processed in a GPS receiver, enabling the receiver to compute
position, velocity and time.
• At least 4 satellites are used to estimate 4 quantities: position in
3-D (X, Y, Z) and the receiver time (T) – see the sketch after this list.
• The nominal GPS Operational Constellation consists of 24
satellites that orbit the earth.
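As a brief aside (not in the original slides), the need for four satellites follows from the standard pseudorange model: each satellite i at known position (xᵢ, yᵢ, zᵢ) contributes one equation in the four unknowns X, Y, Z and the receiver clock bias T.

```latex
% Pseudorange observed from satellite i; c is the speed of light.
\rho_i = \sqrt{(X - x_i)^2 + (Y - y_i)^2 + (Z - z_i)^2} + c\,T ,
\qquad i = 1, 2, 3, 4
```

Four such equations can be solved simultaneously for X, Y, Z and T, which is why a position fix needs signals from at least four satellites.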
102Compiled by : Dr. Zubairul Islam
5.3 - General Functions of Garmin GPS - 60
1. To turn the unit on/off press black button (hold down to turn off)
2. Use ROCKER button to move cursor up and down/left and right
3. To page through/between screens press PAGE/QUIT buttons
4. To find the menu press MENU button twice
5. Use ENTER key to select highlighted fields, enter data, or confirm on screen messages.
6. To cancel out of screen and not save press QUIT
7. Use MARK key to mark your current location as way point.
8. Use FIND key to view the find page.
9. To zoom in or out on the map screen press the IN/OUT buttons
103Compiled by : Dr. Zubairul Islam
5.4 – Exercise to create waypoints
• Press and release "Mark"
• Edit the name field (top)
• "OK"
104Compiled by : Dr. Zubairul Islam
5.5 – Import Waypoints with Google Earth
Start Google Earth and connect the GPS receiver to your PC, then:
1. Tools
2. GPS
3. Import waypoints
4. Import
105Compiled by : Dr. Zubairul Islam
106Compiled by : Dr. Zubairul Islam
The End

More Related Content

What's hot

Types of aerial photographs
Types of aerial photographsTypes of aerial photographs
Types of aerial photographsHARITHANAIR15
 
IMAGE INTERPRETATION TECHNIQUES of survey
IMAGE INTERPRETATION TECHNIQUES of surveyIMAGE INTERPRETATION TECHNIQUES of survey
IMAGE INTERPRETATION TECHNIQUES of surveyKaran Patel
 
Remote sensing-presentaion
Remote sensing-presentaionRemote sensing-presentaion
Remote sensing-presentaionMouna Guru
 
Basics of remote sensing, pk mani
Basics of remote sensing, pk maniBasics of remote sensing, pk mani
Basics of remote sensing, pk maniP.K. Mani
 
Remote sensing - Scanners
Remote sensing - ScannersRemote sensing - Scanners
Remote sensing - ScannersPramoda Raj
 
Fundamentals of Remote Sensing
Fundamentals of Remote Sensing Fundamentals of Remote Sensing
Fundamentals of Remote Sensing Pallab Jana
 
Remote sensing and image interpretation
Remote sensing and image interpretationRemote sensing and image interpretation
Remote sensing and image interpretationMd. Nazir Hossain
 
Multispectral remote sensing
Multispectral remote sensingMultispectral remote sensing
Multispectral remote sensingDharmendera Meena
 
Chapter 1 (Introduction to remote sensing)
Chapter 1 (Introduction to remote sensing)Chapter 1 (Introduction to remote sensing)
Chapter 1 (Introduction to remote sensing)Shankar Gangaju
 
Introduction to Landsat
Introduction to LandsatIntroduction to Landsat
Introduction to LandsatNizam GIS
 
Remote sensing
Remote sensingRemote sensing
Remote sensingGokul Saud
 
A Brief Introduction to Remote Sensing Satellites
A Brief Introduction to Remote Sensing Satellites A Brief Introduction to Remote Sensing Satellites
A Brief Introduction to Remote Sensing Satellites Alireza Rahimzadeganasl
 

What's hot (20)

georeference
georeferencegeoreference
georeference
 
Remote sensing
Remote sensingRemote sensing
Remote sensing
 
Types of aerial photographs
Types of aerial photographsTypes of aerial photographs
Types of aerial photographs
 
IMAGE INTERPRETATION TECHNIQUES of survey
IMAGE INTERPRETATION TECHNIQUES of surveyIMAGE INTERPRETATION TECHNIQUES of survey
IMAGE INTERPRETATION TECHNIQUES of survey
 
Remote sensing-presentaion
Remote sensing-presentaionRemote sensing-presentaion
Remote sensing-presentaion
 
Basics of remote sensing, pk mani
Basics of remote sensing, pk maniBasics of remote sensing, pk mani
Basics of remote sensing, pk mani
 
LISS
LISSLISS
LISS
 
GEOID-DETERMINAION
GEOID-DETERMINAIONGEOID-DETERMINAION
GEOID-DETERMINAION
 
Remote sensing - Scanners
Remote sensing - ScannersRemote sensing - Scanners
Remote sensing - Scanners
 
Fundamentals of Remote Sensing
Fundamentals of Remote Sensing Fundamentals of Remote Sensing
Fundamentals of Remote Sensing
 
Remote sensing and image interpretation
Remote sensing and image interpretationRemote sensing and image interpretation
Remote sensing and image interpretation
 
Digital Elevation Model (DEM)
Digital Elevation Model (DEM)Digital Elevation Model (DEM)
Digital Elevation Model (DEM)
 
Landsat
LandsatLandsat
Landsat
 
Multispectral remote sensing
Multispectral remote sensingMultispectral remote sensing
Multispectral remote sensing
 
Basic of Geodesy
Basic of GeodesyBasic of Geodesy
Basic of Geodesy
 
Chapter 1 (Introduction to remote sensing)
Chapter 1 (Introduction to remote sensing)Chapter 1 (Introduction to remote sensing)
Chapter 1 (Introduction to remote sensing)
 
Introduction to Landsat
Introduction to LandsatIntroduction to Landsat
Introduction to Landsat
 
Remote sensing
Remote sensingRemote sensing
Remote sensing
 
Remote Sensing
Remote Sensing Remote Sensing
Remote Sensing
 
A Brief Introduction to Remote Sensing Satellites
A Brief Introduction to Remote Sensing Satellites A Brief Introduction to Remote Sensing Satellites
A Brief Introduction to Remote Sensing Satellites
 

Similar to 1 remote sensing

rsgis-unitii-160731062950.pdf
rsgis-unitii-160731062950.pdfrsgis-unitii-160731062950.pdf
rsgis-unitii-160731062950.pdfBSuresh26
 
Iirs lecure notes for Remote sensing –An Overview of Decision Maker
Iirs lecure notes for Remote sensing –An Overview of Decision MakerIirs lecure notes for Remote sensing –An Overview of Decision Maker
Iirs lecure notes for Remote sensing –An Overview of Decision MakerTushar Dholakia
 
Fundamentals of remote sensing
Fundamentals of remote sensingFundamentals of remote sensing
Fundamentals of remote sensingGhassan Hadi
 
Fundamentals of remonte sensing
Fundamentals of remonte sensingFundamentals of remonte sensing
Fundamentals of remonte sensingSi Mokrane SIAD
 
applicationsofremotesensingingeologicalaspects-170606133459.pdf
applicationsofremotesensingingeologicalaspects-170606133459.pdfapplicationsofremotesensingingeologicalaspects-170606133459.pdf
applicationsofremotesensingingeologicalaspects-170606133459.pdfCIVIL48
 
Applications of remote sensing in geological aspects
Applications of remote sensing in geological aspectsApplications of remote sensing in geological aspects
Applications of remote sensing in geological aspectsPramoda Raj
 
APPLICATION OF REMOTE SENSING AND GIS IN AGRICULTURE
APPLICATION OF REMOTE SENSING AND GIS IN AGRICULTUREAPPLICATION OF REMOTE SENSING AND GIS IN AGRICULTURE
APPLICATION OF REMOTE SENSING AND GIS IN AGRICULTURELagnajeetRoy
 
remote sensing for study.docx
remote sensing for study.docxremote sensing for study.docx
remote sensing for study.docxbbc37142
 
Basic of Remote Sensing
Basic of Remote SensingBasic of Remote Sensing
Basic of Remote Sensinggueste5cfed
 
Introduction to Remote Sensing- by Wankie Richman
Introduction to Remote Sensing- by Wankie RichmanIntroduction to Remote Sensing- by Wankie Richman
Introduction to Remote Sensing- by Wankie RichmanRichmanWankie
 

Similar to 1 remote sensing (20)

rsgis-unitii-160731062950.pdf
rsgis-unitii-160731062950.pdfrsgis-unitii-160731062950.pdf
rsgis-unitii-160731062950.pdf
 
Iirs lecure notes for Remote sensing –An Overview of Decision Maker
Iirs lecure notes for Remote sensing –An Overview of Decision MakerIirs lecure notes for Remote sensing –An Overview of Decision Maker
Iirs lecure notes for Remote sensing –An Overview of Decision Maker
 
Introduction to Remote Sensing
Introduction to Remote SensingIntroduction to Remote Sensing
Introduction to Remote Sensing
 
Remote sensing
 Remote sensing Remote sensing
Remote sensing
 
Fundamentals of remote sensing
Fundamentals of remote sensingFundamentals of remote sensing
Fundamentals of remote sensing
 
Remote Sensing
Remote SensingRemote Sensing
Remote Sensing
 
Fundamentals of remonte sensing
Fundamentals of remonte sensingFundamentals of remonte sensing
Fundamentals of remonte sensing
 
G044044249
G044044249G044044249
G044044249
 
Remort sensing
Remort sensingRemort sensing
Remort sensing
 
applicationsofremotesensingingeologicalaspects-170606133459.pdf
applicationsofremotesensingingeologicalaspects-170606133459.pdfapplicationsofremotesensingingeologicalaspects-170606133459.pdf
applicationsofremotesensingingeologicalaspects-170606133459.pdf
 
Applications of remote sensing in geological aspects
Applications of remote sensing in geological aspectsApplications of remote sensing in geological aspects
Applications of remote sensing in geological aspects
 
APPLICATION OF REMOTE SENSING AND GIS IN AGRICULTURE
APPLICATION OF REMOTE SENSING AND GIS IN AGRICULTUREAPPLICATION OF REMOTE SENSING AND GIS IN AGRICULTURE
APPLICATION OF REMOTE SENSING AND GIS IN AGRICULTURE
 
Remote Sensing
Remote SensingRemote Sensing
Remote Sensing
 
remote sensing for study.docx
remote sensing for study.docxremote sensing for study.docx
remote sensing for study.docx
 
Basic of Remote Sensing
Basic of Remote SensingBasic of Remote Sensing
Basic of Remote Sensing
 
Report
ReportReport
Report
 
remote sensing
remote sensingremote sensing
remote sensing
 
Introduction to Remote Sensing- by Wankie Richman
Introduction to Remote Sensing- by Wankie RichmanIntroduction to Remote Sensing- by Wankie Richman
Introduction to Remote Sensing- by Wankie Richman
 
Remote sensing and aerial photography
Remote sensing and aerial photographyRemote sensing and aerial photography
Remote sensing and aerial photography
 
rs&gis-theena.pptx
rs&gis-theena.pptxrs&gis-theena.pptx
rs&gis-theena.pptx
 

Recently uploaded

Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfUjwalaBharambe
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxDr.Ibrahim Hassaan
 
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxChelloAnnAsuncion2
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.arsicmarija21
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for BeginnersSabitha Banu
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfphamnguyenenglishnb
 
Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........LeaCamillePacle
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceSamikshaHamane
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationAadityaSharma884161
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptxSherlyMaeNeri
 

Recently uploaded (20)

Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptx
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for Beginners
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
 
Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint Presentation
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 

1 remote sensing

  • 1. COURSE MATERIAL COURSE TITLE - FUNDAMENTALS OF REMOTE SENSING COURSE CODE - GeES2081 Module Name and Code: SPATIAL DATA ACQUISITION AND ANALYSIS, GeESM2081 ADIGRAT UNIVERSITY COLLEGE OF SOCIAL SCIENCE AND HUMANITIES DEPARTMENT OF GEOGRAPHY AND ENVIRONMENTAL STUDIES 1 Edited By - Dr. Zubairul Islam, Associate Professor, Department of geography and Environmental studies. zubairul@gmail.com Compiled by : Dr. Zubairul Islam
  • 2. COURSE CONTENT AND SCHEDULE OF ACTIVITIES Week / day CONTENTS SLIDE NO. 1st Week CHAPTER ONE: Introduction 6-19 Day 1 1.1. Definition of remote sensing 1.2. Elements of remote sensing process 1.3. Advantage of Remote sensing 1.4. Application of Remote sensing Day 2 CHAPTER TWO: Electromagnetic energy& remote sensing 20-38 2.1. Electromagnetic radiation 2.2. Electromagnetic spectrum 2.3. Energy interaction in the atmosphere 2.4. Energy interaction with the earth’s surface Day 3&4 CHAPTER THREE: Satellites and images characteristics 39-67 3.1. Satellite Sensors and platforms 3.2 Satellite Characteristics: Orbits and Swaths 3.3. Satellite Image data characteristics 3.4. Image Specifications of Multispectral Satellite Sensors 2Compiled by : Dr. Zubairul Islam
  • 3. 2nd Week CHAPTER FOUR: DIGITAL IMAGE PROCESSING 68-99 Day 1 4.1 - Raster Bands Day 1 4.2 - Software Demonstration: Arc GIS 9.3 Day 2 4.3 – General Exercises with Arc GIS 9.3 Lab 4.3.1 – Add Raster Data Lab 4.3.2 - Clip image using Lab 4.3.3 - Creating multiband dataset Day 3 4.4 – Radiometric Corrections Lab 4.4 .1 - Conversion of DN Values to reflectance Day 4 4.5 – Band Combinations Lab 4.5.1 - Band Combinations Day 4 4.6 – Image Classification & Land use mapping Lab 4.6.1 - Image Classification Day 5 4.7 – Land use Mapping Lab 4.7.1 - Land use Mapping 3Compiled by : Dr. Zubairul Islam
  • 4. 3rd Week CHAPTER FIVE: GPS And Remote Sensing 100-106 Day 1 5.1. Introduction Day 1 5.2. Satellite based positioning Day 2 5.3. Lab - General Functions of Garmin GPS – 60 Day 3 5.4. Lab – Create Waypoints 4Compiled by : Dr. Zubairul Islam
  • 5. RECOMMENDED MODE OF ASSESSMENT Continuous assessment………………… (60%)  Test 1& 2…10 + 10………1st week Friday After noon & Saturday Morning Shift – Ch 1,2&3  Lab 1& 2… 10 + 10………2nd week Friday After noon & Saturday Morning Shift– Ch 4  Group project work …..20…..3rd week Friday for submission – Ch 5 Final exam…………………………….40% ………… 20% from chapter 1 to 3 & 20 % from chapter 4&5 5Compiled by : Dr. Zubairul Islam
  • 6. Chapter One: Introduction 1.1. Definition of remote sensing 1.2 Elements of remote sensing process 1.3 Advantages of Remote Sensing 1.4 Applications of Remote Sensing 6Compiled by : Dr. Zubairul Islam
  • 7. 1.1. Definition of remote sensing "Remote sensing is the science (and to some extent, art) of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy and processing, analyzing, and applying that information." 7Compiled by : Dr. Zubairul Islam
  • 8. 1.2 Elements of remote sensing process There are seven elements comprise the remote sensing process from beginning to end. 1. Energy Source or Illumination (A) - the first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest. 2. Radiation and the Atmosphere (B) – as the energy travels from its source to the target, it will come in contact with and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the 3. Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation. 4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation. 8Compiled by : Dr. Zubairul Islam
  • 9. 5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital). 7. Application (G) - the final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem. 6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated. 9Compiled by : Dr. Zubairul Islam
  • 10. 1.3 Advantages of Remote Sensing 1. Enables to observe a broad area at a time 2. Enables to observe the area for a long period -- Repeat pass observation (Time series data, Change detection) 3. Enables to know the condition without visiting the area 4. Enables to know invisible information – Sensors for various electromagnetic spectrum (Infrared, microwave) 10Compiled by : Dr. Zubairul Islam
  • 11. 1.2. Applications of Remote Sensing Remote sensing has enabled mapping, studying, monitoring and management of various resources like agriculture, forestry, geology, water, ocean etc. It has further enabled monitoring of environment and thereby helping in conservation. In the last four decades it has grown as a major tool for collecting information on almost every aspect on the earth. With the availability of very high spatial resolution satellites in the recent years, the applications have multiplied. 11Compiled by : Dr. Zubairul Islam
  • 12. Current major applications of remote sensing may as following: 1. Geology 2. Forestry 3. Change detection 4. Oceanography 5. Meteorology 6. Land degradation 7. Land use applications 12Compiled by : Dr. Zubairul Islam
  • 13. Geology In geology, for instance, remote sensing can be applied to analyze and map large, remote areas. Remote sensing interpretation also makes it easy for geologists in this case to identify an area's rock types, geomorphology, and changes from natural events such as a flood or landslide. 13Compiled by : Dr. Zubairul Islam
  • 14. Forestry Satellite imagery is used to identify and map: - • The species of native and exotic forest trees. • The effects of major diseases or adverse change in environmental conditions. • The geographic extent of forests. etc 14Compiled by : Dr. Zubairul Islam
  • 15. Satellite imagery is not always able to provide exact details about the species or age of vegetation. However, the imagery provides a very good means of measuring significant change in vegetation cover, whether it is through clearing, wildfire damage or environmental stress. The most common form of environmental stress is water deficiency. Change detection 15Compiled by : Dr. Zubairul Islam
  • 16. Oceanography Remote sensing is applied to oceanography studies. Remote sensing is used, for example, to measure sea surface temperature and to monitor marine habitats. 16Compiled by : Dr. Zubairul Islam
  • 17. Meteorology Remote sensing is an effective method for mapping cloud type and extent, and cloud top temperature. 17Compiled by : Dr. Zubairul Islam
  • 18. Land degradation Imagery can be used to map areas of poor or no vegetation cover. A range of factors, including saline or sodic soils, and overgrazing, can cause degraded landscapes. 18Compiled by : Dr. Zubairul Islam
  • 19. land use applications Additionally, those studying urban and other land use applications are also concerned with remote sensing because it allows them to easily pick out which land uses are present in an area. This can then be used as data in city planning applications and the study of species habitat, for example. 19Compiled by : Dr. Zubairul Islam
  • 20. Chapter Two Electromagnetic energy and remote sensing 2.1. Electromagnetic radiation 2.2. Electromagnetic spectrum 2.3 Energy interaction in the atmosphere 2.4 Energy interaction with the earth’s surface 20Compiled by : Dr. Zubairul Islam
  • 21. The first requirement for remote sensing is to have an energy source to illuminate the target (unless the sensed energy is being emitted by the target). This energy is in the form of electromagnetic radiation. 2.1 Electromagnetic Radiation Two characteristics of electromagnetic radiation are particularly important for understanding remote sensing. These are the wavelength and frequency. The wavelength is the length of one wave cycle, which can be measured as the distance between successive wave crests. Wavelength is usually represented by the Greek letter lambda (λ). Wavelength is measured in metres (m) or some factor of metres such as nanometres (nm, 10-9 metres), micrometres (μm, 10-6 metres) (μm, 10-6 metres) or centimetres (cm, 10-2 metres). Frequency refers to the number of cycles of a wave passing a fixed point per unit of time. wavelength and frequency are inversely related to each other. The shorter the wavelength, the higher the frequency. 21Compiled by : Dr. Zubairul Islam
  • 22. 2.2 - The Electromagnetic Spectrum The electromagnetic (EM) spectrum is the continuous range of electromagnetic radiation, extending from gamma rays (highest frequency & shortest wavelength) to radio waves (lowest frequency & longest wavelength) and including visible light. The EM spectrum can be divided into seven different regions — gamma rays, X-rays, ultraviolet, visible light, infrared, microwaves and radio waves. 22Compiled by : Dr. Zubairul Islam
  • 23. Remote sensing involves the measurement of energy in many parts of the electromagnetic (EM) spectrum. The major regions of interest in satellite sensing are visible light, reflected and emitted infrared, and the microwave regions. The measurement of this radiation takes place in what are known as spectral bands. A spectral band is defined as a discrete interval of the EM spectrum. For example the wavelength range of 0.4μm to 0.5μm (μm = micrometers or 10-6m) is one spectral band. 23Compiled by : Dr. Zubairul Islam
  • 24. Satellite sensors have been designed to measure responses within particular spectral bands to enable the discrimination of the major Earth surface materials. Scientists will choose a particular spectral band for data collection depending on what they wish to examine. The design of satellite sensors is based on the absorption characteristics of Earth surface materials across all the measurable parts in the EM spectrum. 24Compiled by : Dr. Zubairul Islam
  • 25. Visible Spectrum (1) The light which our eyes - our "remote sensors" - can detect is part of the visible spectrum. It is important to recognize how small the visible portion is relative to the rest of the spectrum. There is a lot of radiation around us which is invisible to our eyes, but can be detected by other remote sensing instruments and used to our advantage. It is important to note that this is the only portion of the EM spectrum we can associate with the concept of colours. 25Compiled by : Dr. Zubairul Islam
  • 26. Blue, green, and red are the primary colors or wavelengths of the visible spectrum. They are defined as such because no single primary color can be created from the other two, but all other colors can be formed by combining blue, green, and red in various proportions. Although we see sunlight as a uniform or homogeneous color, it is actually composed of various wavelengths of radiation in primarily the ultraviolet, visible and infrared portions of the spectrum. The visible portion of this radiation can be shown in its component colors when sunlight is passed through a prism. 26Compiled by : Dr. Zubairul Islam
  • 27. Infrared (IR) Region The IR Region covers the wavelength range from approximately 0.7 μm to 100 μm - more than 100 times as wide as the visible portion! The IR region can be divided into two categories based on their radiation properties - the reflected IR, and the emitted or thermal IR. 27Compiled by : Dr. Zubairul Islam
  • 28. Reflected and Thermal IR Radiation in the reflected IR region is used for remote sensing purposes in ways very similar to radiation in the visible portion. The reflected IR covers wavelengths from approximately 0.7 μm to 3.0 μm. The thermal IR region is quite different than the visible and reflected IR portions, as this energy is essentially the radiation that is emitted from the Earth's surface in the form of heat. The thermal IR covers wavelengths from approximately 3.0 μm to 100 μm. 28Compiled by : Dr. Zubairul Islam
  • 29. Microwave Region The portion of the spectrum of more recent interest to remote sensing is the microwave region from about 1 mm to 1 m. This covers the longest wavelengths used for remote sensing. The shorter wavelengths have properties similar to the thermal infrared region while the longer wavelengths approach the wavelengths used for radio broadcasts. 29Compiled by : Dr. Zubairul Islam
  • 30. 30Compiled by : Dr. Zubairul Islam
  • 31. 2.3 Interactions with the Atmosphere Before radiation used for remote sensing reaches the Earth's surface it has to travel through some distance of the Earth's atmosphere. Particles and gases in the atmosphere can affect the incoming light and radiation. These effects are caused by the mechanisms of Scattering and absorption. Scattering occurs when particles or large gas molecules present in the atmosphere interact with and cause the electromagnetic radiation to be redirected from its original path. How much scattering takes place depends on several factors including the wavelength of the radiation, the abundance of particles or gases, and the distance the radiation travels through the atmosphere. There are three (3) types of scattering which take place. Rayleigh scattering occurs when particles are very small compared to the wavelength of the radiation. These could be particles such as small specks of dust or nitrogen and oxygen molecules. Rayleigh scattering causes shorter wavelengths of energy to be scattered much more than longer wavelengths. Rayleigh scattering is the dominant scattering mechanism in the upper atmosphere. The fact that the sky appears "blue" during the day is because of this Phenomenon. Red color at sunrise and sunset is also because of this phenomena. Scattering 31Compiled by : Dr. Zubairul Islam
  • 32. Mie scattering occurs when the particles are just about the same size as the wavelength of the radiation. Dust, pollen, smoke and water vapour are common causes of Mie scattering which tends to affect longer wavelengths than those affected by Rayleigh scattering. Mie scattering occurs mostly in the lower portions of the atmosphere where larger particles are more abundant, and dominates when cloud conditions are overcast. The final scattering mechanism of importance is called nonselective scattering. This occurs when the particles are much larger than the wavelength of the radiation. Water droplets and large dust particles can cause this type of scattering. Nonselective scattering gets its name from the fact that all wavelengths are scattered about equally. This type of scattering causes fog and clouds to appear white to our eyes because blue, green, and red light are all scattered in approximately equal quantities (blue+green+red light = white light). 32Compiled by : Dr. Zubairul Islam
  • 33. Absorption is the other main mechanism at work when electromagnetic radiation interacts with the atmosphere. In contrast to scattering, this phenomenon causes molecules in the atmosphere to absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapour are the three main atmospheric constituents which absorb radiation. Ozone serves to absorb the harmful (to most living things) ultraviolet radiation from the sun. Without this protective layer in the atmosphere our skin would burn when exposed to sunlight. carbon dioxide referred to as a greenhouse gas. This is because it tends to absorb radiation strongly in the far infrared portion of the spectrum - that area associated with thermal heating - which serves to trap this heat inside the atmosphere. Water vapour in the atmosphere absorbs much of the incoming longwave infrared and shortwave microwave radiation (between 22μm and 1m). The presence of water vapour in the lower atmosphere varies greatly from location to location and at different times of the year. For example, the air mass above a desert would have very little water vapour to absorb energy, while the tropics would have high concentrations of water vapour (i.e. high humidity). Absorption 33Compiled by : Dr. Zubairul Islam
  • 34. Because these gases absorb electromagnetic energy in very specific regions of the spectrum, they influence where (in the spectrum) we can "look" for remote sensing purposes. Those areas of the spectrum which are not severely influenced by atmospheric absorption and thus, are useful to remote sensors, are called atmospheric windows. By comparing the characteristics of the two most common energy/radiation sources (the sun and the earth) with the atmospheric windows available to us, we can define those wavelengths that we can use most effectively for remote sensing. The visible portion of the spectrum, to which our eyes are most sensitive, corresponds to both an atmospheric window and the peak energy level of the sun. Note also that heat energy emitted by the Earth corresponds to a window around 10 μm in the thermal IR portion of the spectrum, while the large window at wavelengths beyond 1 mm is associated with the microwave region. 34Compiled by : Dr. Zubairul Islam
  • 35. 2.4 Energy interaction with the earth’s surface Radiation that is not absorbed or scattered in the atmosphere can reach and interact with the Earth's surface. There are three (3) forms of interaction that can take place when energy strikes, or is incident (I) upon the surface. These are: absorption (A); transmission (T); and reflection (R). The total incident energy will interact with the surface in one or more of these three ways. The proportions of each will depend on the wavelength of the energy and the material and condition of the feature. Absorption (A) occurs when radiation (energy) is absorbed into the target while transmission (T) occurs when radiation passes through a target. Reflection (R) occurs when radiation "bounces" off the target and is redirected. In remote sensing, we are most interested in measuring the radiation reflected from targets. We refer to two types of reflection, which represent the two extreme ends of the way in which energy is reflected from a target: specular reflection and diffuse reflection. 35Compiled by : Dr. Zubairul Islam
  • 36. Let's take a look at a couple of examples of targets at the Earth's surface and how energy at the visible and infrared wavelengths interacts with them. Leaves: A chemical compound in leaves called chlorophyll strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. Leaves appear "greenest“ to us in the summer, when chlorophyll content is at its maximum. In autumn, there is less chlorophyll in the leaves, so there is less absorption and proportionately more reflection of the red wavelengths, making the leaves appear red or yellow (yellow is a combination of red and green wavelengths). The internal structure of healthy leaves act as excellent diffuse reflectors of near-infrared wavelengths. If our eyes were sensitive to near-infrared, trees would appear extremely bright to us at these wavelengths. In fact, measuring and monitoring the near-IR reflectance is one way that scientists can determine how healthy (or unhealthy) vegetation may be. 36Compiled by : Dr. Zubairul Islam
  • 37. Water: Longer wavelength visible and near infrared radiation is absorbed more by water than shorter visible wavelengths. Thus water typically looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and darker if viewed at red or near infrared wavelengths. If there is suspended sediment present in the upper layers of the water body, then this will allow better reflectivity and a brighter appearance of the water. The apparent colour of the water will show a slight shift to longer wavelengths. Suspended sediment (S) can be easily confused with shallow (but clear) water, since these two phenomena appear very similar. Chlorophyll in algae absorbs more of the blue wavelengths and reflects the green, making the water appear more green in colour when algae is present. The topography of the water surface (rough, smooth, floating materials, etc.) can also lead to complications for water-related interpretation due to potential problems of specular reflection and other influences on colour and brightness. 37Compiled by : Dr. Zubairul Islam
  • 38. We can see from these examples that, depending on the complex make-up of the target that is being looked at, and the wavelengths of radiation involved, we can observe very different responses to the mechanisms of absorption, transmission, and reflection. By measuring the energy that is reflected (or emitted) by targets on the Earth's surface over a variety of different wavelengths, we can build up a spectral response for that object. By comparing the response patterns of different features we may be able to distinguish between them, where we might not be able to, if we only compared them at one wavelength. For example, water and vegetation may reflect somewhat similarly in the visible wavelengths but are almost always separable in the infrared. Spectral response can be quite variable, even for the same target type, and can also vary with time (e.g. "green-ness" of leaves) and location. Knowing where to "look" spectrally and understanding the factors which influence the spectral response of the features of interest are critical to correctly interpreting the interaction of electromagnetic radiation with the surface. 38Compiled by : Dr. Zubairul Islam
  • 39. CHAPTER THREE: SATELLITE AND IMAGES CHARACTERISTICS 3.1. Satellite Sensors and platforms 3.1.1 - Airborne remote sensing 3.1.2. Space borne remote sensing 3.2 Satellite Characteristics: Orbits and Swaths 3.3. Satellite Image data characteristics 3.4. Image Specifications of Multispectral Satellite Sensors 39Compiled by : Dr. Zubairul Islam
  • 40. A sensor is a device that measures and records electromagnetic energy. 3.1. Satellite Sensors and platforms Sensor Sensors can be divided into two groups. 1. Passive sensors depend on an external source of energy, usually the sun. Most of the satellite sensors are passive. 2. Active sensors have their own source of energy, an example would be a radar gun. These sensors send out a signal and measure the amount reflected back. Active sensors are more controlled because they do not depend upon varying illumination conditions 40Compiled by : Dr. Zubairul Islam
  • 41. In order for a sensor to collect and record energy reflected or emitted from a target or surface, it must reside on a stable platform removed from earth’s surface being observed. Platforms for remote sensors may be situated on the ground, on an aircraft or balloon (or some other platform within the Earth's atmosphere), or on a spacecraft or satellite outside of the Earth's atmosphere. Platforms 41Compiled by : Dr. Zubairul Islam
  • 42. 3.1.1 - Airborne remote sensing Airborne remote sensing is the oldest and most widely used method of remote sensing. Cameras mounted in light aircraft flying between 200 and 15,000 m capture a large quantity of detailed information. Aerial photos provide an instant visual inventory of a portion of the earth's surface and can be used to create detailed maps. Aerial photographs commonly are taken by commercial aerial photography firms which own and operate specially modified aircraft equipped with mapping quality cameras. Camera and platform configurations can be grouped in terms of oblique and vertical. Oblique aerial photography is taken at an angle to the ground. The resulting images give a view as if the observer is looking out an airplane window. These images are easier to interpret than vertical photographs, but it is difficult to locate and measure features on them for mapping purposes. 42Compiled by : Dr. Zubairul Islam
  • 43. Vertical aerial photography is taken with the camera pointed straight down. The resulting images depict ground features in plan form and are easily compared with maps. Vertical aerial photos are always highly desirable, but are particularly useful for resource surveys in areas where no maps are available. Aerial photos depict features such as field patterns and vegetation which are often omitted on maps. Comparison of old and new aerial photos can also capture changes within an area over time. Vertical aerial photos contain subtle displacements due to relief, tip and tilt of the aircraft and lens distortion. Vertical images may be taken with overlap, typically about 60 percent along the flight line and at least 20 percent between lines. Overlapping images can be viewed with a stereoscope to create a three-dimensional view, called a stereo model. 43Compiled by : Dr. Zubairul Islam
  • 44. 3.1.2. Space borne remote sensing Photography has proven to be an important input to visual interpretation and the production of analog maps. However, the development of satellite platforms, the associated need to telemeter imagery in digital form, and the desire for highly consistent digital imagery have given rise to the development of solid state scanners as a major format for the capture of remotely sensed data. The basic logic of a scanning sensor is the use of a mechanism to sweep a small field of view (known as an instantaneous field of view—IFOV) in a west to east direction at the same time the satellite is moving in a north to south direction. Together this movement provides the means of composing a complete raster image of the environment. A simple scanning technique is to use a rotating mirror that can sweep the field of view in a consistent west to east fashion. The field of view is then intercepted with a prism that can spread the energy contained within the IFOV into its spectral components. Photoelectric detectors (of the same nature as those found in the exposure meters of commonly available photographic cameras) are then arranged in the path of this spectrum to provide electrical measurements of the amount of energy detected in various parts of the electromagnetic spectrum. 44Compiled by : Dr. Zubairul Islam
  • 45. As the scan moves from west to east, these detectors are polled to get a set of readings along the east-west scan. These form the columns along one row of a set of raster images—one for each detector. Movement of the satellite from north to south then positions the system to detect the next row, ultimately leading to the production of a set of raster images as a record of reflectance over a range of spectral bands. There are several satellite systems in operation today that collect imagery that is subsequently distributed to users. Several of the most common systems are described below. Each type of satellite data offers specific characteristics that make it more or less appropriate for a particular application. 45Compiled by : Dr. Zubairul Islam
  • 46. 3.2 Satellite Characteristics: Orbits and Swaths Satellites have several unique characteristics which make them particularly useful for remote sensing of the Earth's surface. The path followed by a satellite is referred to as its orbit. Satellite orbits are matched to the capability and objective of the sensor(s) they carry. Orbit selection can vary in terms of altitude (their height above the Earth's surface) and their orientation and rotation relative to the Earth. Geostationary orbits Satellites at very high altitudes, which view the same portion of the Earth's surface at all times have geostationary orbits. These geostationary satellites revolve at speeds which match the rotation of the Earth so they seem stationary, relative to the Earth's surface. This allows the satellites to observe and collect information continuously over specific areas. Weather and communications satellites commonly have these types of orbits. Due to their high altitude, some geostationary weather satellites can monitor weather and cloud patterns covering an entire hemisphere of the Earth. 46Compiled by : Dr. Zubairul Islam
  • 47. Sun-synchronous Many remote sensing platforms are designed to follow an orbit (basically north-south) which, in conjunction with the Earth's rotation (west-east), allows them to cover most of the Earth's surface over a certain period of time. These are nearpolar orbits, so named for the inclination of the orbit relative to a line running between the North and South poles. These satellite orbits are known as sun-synchronous. 47Compiled by : Dr. Zubairul Islam
  • 48. As a satellite revolves around the Earth, the sensor "sees" a certain portion of the Earth's surface. The area imaged on the surface, is referred to as the swath. Imaging swaths for space borne sensors generally vary between tens and hundreds of kilometers wide. As the satellite orbits the Earth from pole to pole, its east-west position wouldn't change if the Earth didn't rotate. However, as seen from the Earth, it seems that the satellite is shifting westward because the Earth is rotating (from west to east) beneath it. This apparent movement allows the satellite swath to cover a new area with each consecutive pass. The satellite's orbit and the rotation of the Earth work together to allow complete coverage of the Earth's surface, after it has completed one complete cycle of orbits. Swath 48Compiled by : Dr. Zubairul Islam
  • 49. 49Compiled by : Dr. Zubairul Islam
  • 50. 3.3. Satellite Image data characteristics Image data characteristics can be explained under three categories as follows: 1. The spatial resolution, 2. Spectral resolution 3. Radiometric resolution 4. Temporal resolution 50Compiled by : Dr. Zubairul Islam
  • 51. 3.3.1- Spatial Resolution, Pixel size and scale The detail discernible in an image is dependent on the spatial resolution of the sensor and refers to the size of the smallest possible feature that can be detected. Spatial resolution of passive sensors (we will look at the special case of active microwave sensors later) depends primarily on their Instantaneous Field of View (IFOV). The IFOV is the angular cone of visibility of the sensor (A) and determines the area on the Earth's surface which is "seen" from a given altitude at one particular moment in time (B). The size of the area viewed is determined by multiplying the IFOV by the distance from the ground to the sensor (C). This area on the ground is called the resolution cell and determines a sensor's maximum spatial resolution. 51Compiled by : Dr. Zubairul Islam
  • 52. For a homogeneous feature to be detected, its size generally has to be equal to or larger than the resolution cell. If the feature is smaller than this, it may not be detectable as the average brightness of all features in that resolution cell will be recorded. However, smaller features may sometimes be detectable if their reflectance dominates within a particular resolution cell allowing sub-pixel or resolution cell detection. Most remote sensing images are composed of a matrix of picture elements, or pixels, which are the smallest units of an image. Image pixels are normally square and represent a certain area on an image. It is important to distinguish between pixel size and spatial resolution - they are not interchangeable. If a sensor has a spatial resolution of 20 metres and an image from that sensor is displayed at full resolution, each pixel represents an area of 20m x 20m on the ground. In this case the pixel size and resolution are the same. However, it is possible to display an image with a pixel size different than the resolution. PIXEL 52Compiled by : Dr. Zubairul Islam
  • 53. Images where only large features are visible are said to have coarse or low resolution. In fine or high resolution images, small objects can be detected. Military sensors for example, are designed to view as much detail as possible, and therefore have very fine resolution. Commercial satellites provide imagery with resolutions varying from a few metres to several kilometers. Generally speaking, the finer the resolution, the less total ground area can be seen. The ratio of distance on an image or map, to actual ground distance is referred to as scale. If you had a map with a scale of 1:100,000, an object of 1cm length on the map would actually be an object 100,000cm (1km) long on the ground. Maps or images with small "map-to-ground ratios" are referred to as small scale (e.g. 1:100,000), and those with larger ratios (e.g. 1:5,000) are called large scale. SCALE 53Compiled by : Dr. Zubairul Islam
  • 54. 3.3.2- Spectral Resolution Different classes of features and details in an image can often be distinguished by comparing their responses over distinct wavelength ranges. Broad classes, such as water and vegetation, can usually be separated using very broad wavelength ranges - the visible and near infrared other more specific classes, such as different rock types, may not be easily distinguishable using either of these broad wavelength ranges and would require comparison at much finer wavelength ranges to separate them. Thus, we would require a sensor with higher spectral resolution. Spectral resolution describes the ability of a sensor to define fine wavelength intervals. The finer the spectral resolution, the narrower the wavelength ranges for a particular channel or band. 54Compiled by : Dr. Zubairul Islam
  • 55. Black and white film records wavelengths extending over much, or all of the visible portion of the electromagnetic spectrum. Its spectral resolution is fairly coarse, as the various wavelengths of the visible spectrum are not individually distinguished and the overall reflectance in the entire visible portion is recorded. Color film is also sensitive to the reflected energy over the visible portion of the spectrum, but has higher spectral resolution, as it is individually sensitive to the reflected energy at the blue, green, and red wavelengths of the spectrum. Thus, it can represent features of various colors based on their reflectance in each of these distinct wavelength ranges. 55Compiled by : Dr. Zubairul Islam
  • 56. Many remote sensing systems record energy over several separate wavelength ranges at various spectral resolutions. These are referred to as multi-spectral sensors and will be described in some detail in following sections. Advanced multi-spectral sensors called hyperspectral sensors, detect hundreds of very narrow spectral bands throughout the visible, near-infrared, and mid-infrared portions of the electromagnetic spectrum. Their very high spectral resolution facilitates fine discrimination between different targets based on their spectral response in each of the narrow bands. 56Compiled by : Dr. Zubairul Islam
  • 57. 3.3.3- Radiometric Resolution The radiometric characteristics describe the actual information content in an image. Every time an image is acquired on film or by a sensor, its sensitivity to the magnitude of the electromagnetic energy determines the radiometric resolution. The radiometric resolution of an imaging system describes its ability to discriminate very slight differences in energy. The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy. 57Compiled by : Dr. Zubairul Islam
  • 58. Imagery data are represented by positive digital numbers which vary from 0 to (one less than) a selected power of 2. This range corresponds to the number of bits used for coding numbers in binary format. Each bit records an exponent of power 2 (e.g. 1 bit = 2^1 = 2). The maximum number of brightness levels available depends on the number of bits used in representing the energy recorded. Thus, if a sensor used 8 bits to record the data, there would be 2^8 = 256 digital values available, ranging from 0 to 255. However, if only 4 bits were used, then only 2^4 = 16 values ranging from 0 to 15 would be available. Thus, the radiometric resolution would be much less. Image data are generally displayed in a range of grey tones, with black representing a digital number of 0 and white representing the maximum value (for example, 255 in 8-bit data). By comparing a 2-bit image with an 8-bit image, we can see that there is a large difference in the level of detail discernible depending on their radiometric resolutions. 58Compiled by : Dr. Zubairul Islam
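A minimal sketch of the bit-depth arithmetic above; the bit depths listed are examples only.

# Illustrative: number of brightness (grey) levels available for a given bit depth.
def brightness_levels(bits):
    """Number of grey levels a sensor can record with the given number of bits."""
    return 2 ** bits

for bits in (1, 2, 4, 8):
    levels = brightness_levels(bits)
    print(f"{bits} bits -> {levels} levels (0 to {levels - 1})")
# 8 bits -> 256 levels (0 to 255); 4 bits -> 16 levels (0 to 15), etc.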
  • 59. In addition to spatial, spectral, and radiometric resolution, the concept of temporal resolution is also important to consider in a remote sensing system. The revisit period of a satellite sensor is usually several days. Therefore the absolute temporal resolution of a remote sensing system to image the exact same area at the same viewing angle a second time is equal to this period. The time factor in imaging is important when: • Persistent clouds offer limited clear views of the Earth's surface (often in the tropics) • Short-lived phenomena (floods, oil slicks, etc.) need to be imaged • Multi-temporal comparisons are required (e.g. the spread of a forest disease from one year to the next) • The changing appearance of a feature over time can be used to distinguish it from near-similar features (wheat / maize) 3.3.4- Temporal Resolution 59Compiled by : Dr. Zubairul Islam
  • 60. 3.4. Image Specifications of Multispectral Satellite Sensors The Landsat program is the longest running enterprise for acquisition of satellite imagery of Earth. On July 23, 1972 the Earth Resources Technology Satellite was launched. This was eventually renamed to Landsat. The most recent, Landsat 8, was launched on February 11, 2013. Landsat sensors collected data over a swath width of 185 km, with a full scene being defined as 185 km x 185 km. The instruments on the Landsat satellites have acquired millions of images. The images, archived in the United States and at Landsat receiving stations around the world, are a unique resource for global change research and applications in agriculture, cartography, geology, forestry, regional planning, surveillance and education, and can be viewed through the USGS 'Earth Explorer' website. Landsat 60Compiled by : Dr. Zubairul Islam
  • 61. Landsat: Spectral Band Characteristics (band - region of the electromagnetic spectrum - typical uses)
Band 1 - Blue: blue light is scattered by the atmosphere and illuminates material in shadows better than longer wavelengths; penetrates clear water better than other colors; absorbed by chlorophyll, so plants don't show up very brightly in this band; useful for soil/vegetation discrimination, forest type mapping, and identifying man-made features.
Band 2 - Green: penetrates clear water fairly well and gives excellent contrast between clear and turbid (muddy) water; helps find oil on the surface of water, and vegetation (plant life), which reflects more green light than any other visible color; man-made features are still visible.
Band 3 - Red: limited water penetration; reflects well from dead foliage, but not well from live foliage with chlorophyll; useful for identifying vegetation types, soils, and urban (city and town) features.
Band 4 - Near IR (NIR): good for mapping shorelines and biomass content; very good at detecting and analyzing vegetation.
Band 5 - Shortwave IR (SWIR): limited cloud penetration; provides good contrast between different types of vegetation; useful for measuring the moisture content of soil and vegetation; helps differentiate between snow and clouds.
Band 6 - Thermal IR (TIR or LWIR): useful to observe temperature and its effects, such as daily and seasonal variations; useful to identify some vegetation density, moisture, and cover type; ETM+ TIR has 60-metre pixels; TIR pixels on Landsat-5 are 120 metres.
Band 7 - Another SWIR: limited cloud penetration; provides good contrast between different types of vegetation; useful for measuring the moisture content of soil and vegetation; helps differentiate between snow and clouds.
Band 8 - Panchromatic ("pan"): on Landsat 7 only; has 15 m resolution; used to "sharpen" images.
61Compiled by : Dr. Zubairul Islam
  • 62. Landsat 5 TM: Specifications
Data product: Landsat 5 TM. Provider: NASA/USGS. Orbit: sun-synchronous, 705 km orbital height. Spatial resolution: 30 m (120 m for the thermal band). Swath width: 185 km. Pass over time: equator at ~09h45 (local time), every 16 days. Date range of acquisition: since March 1, 1984 (note: the first Landsat mission was launched in 1972).
Spectral coverage:
BAND 1: 0.45-0.52 µm (30 m)
BAND 2: 0.52-0.60 µm (30 m)
BAND 3: 0.63-0.69 µm (30 m)
BAND 4: 0.76-0.90 µm (30 m)
BAND 5: 1.55-1.75 µm (30 m)
BAND 6: 10.4-12.5 µm (120 m) (thermal IR)
BAND 7: 2.08-2.35 µm (30 m)
Data use: oceanography, aerosols, bathymetry, vegetation types, peak vegetation, biomass content analysis, moisture analysis, thermal mapping, mineral deposit identification.
62Compiled by : Dr. Zubairul Islam
  • 63. Landsat 7 ETM+: Specifications
Data product: Landsat 7 ETM+. Provider: NASA/USGS. Orbit: sun-synchronous, 705 km orbital height. Spatial resolution: 30 m multispectral, 60 m thermal, 15 m panchromatic. Swath width: 183 km. Pass over time: equator at ~10h00 (local time), every 16 days. Date range of acquisition: since April 15, 1999.
Spectral coverage (B1: Band 1, B2: Band 2, and so on):
B1: 0.45-0.515 µm (30 m)
B2: 0.525-0.605 µm (30 m)
B3: 0.63-0.69 µm (30 m)
B4: 0.75-0.90 µm (30 m)
B5: 1.55-1.75 µm (30 m)
B6: 10.4-12.5 µm (60 m) (thermal IR)
B7: 2.09-2.35 µm (30 m)
B8: 0.52-0.9 µm (15 m) (panchromatic)
Data use: oceanography, aerosols, bathymetry, vegetation types, peak vegetation, biomass content analysis, moisture analysis, thermal mapping, mineral deposit identification.
63Compiled by : Dr. Zubairul Islam
  • 64. Landsat 8: Specifications
Data product: Landsat 8, OLI (B1-B9) and TIRS (B10-B11). Provider: NASA/USGS. Orbit: sun-synchronous, 705 km orbital height. Spatial resolution: 30 m multispectral, 100 m thermal, 15 m panchromatic. Swath width: 185 km. Pass over time: equator at ~10h00 (local time), every 16 days. Date range of acquisition: since February 11, 2013.
Spectral coverage (B1: Band 1, B2: Band 2, and so on):
B1: 0.433-0.453 µm (30 m)
B2: 0.450-0.515 µm (30 m)
B3: 0.525-0.600 µm (30 m)
B4: 0.630-0.680 µm (30 m)
B5: 0.845-0.885 µm (30 m)
B6: 1.560-1.660 µm (30 m)
B7: 2.100-2.300 µm (30 m)
B8: 0.500-0.680 µm (15 m)
B9: 1.360-1.390 µm (30 m)
B10: 10.6-11.2 µm (100 m) (thermal IR)
B11: 11.5-12.5 µm (100 m) (thermal IR)
Data use: oceanography, aerosols, bathymetry, vegetation types, peak vegetation, biomass content analysis, moisture analysis, cloud cover analysis, thermal mapping, soil moisture estimation.
64Compiled by : Dr. Zubairul Islam
  • 66. QuickBird: Specifications
Data product: QuickBird. Provider: DigitalGlobe. Orbit: sun-synchronous, 482 km / 450 km orbital height. Spatial resolution: 65 cm panchromatic and 2.62 m multispectral at 482 km; 61 cm panchromatic and 2.44 m multispectral at 450 km. Swath width: 16.8-18 km. Pass over time: every 2.4-5.9 days, equator at ~10h30 (local time). Date range of acquisition: since October 18, 2001.
Spectral coverage:
Panchromatic (B/W): 405-1053 nm
Blue: 430-545 nm
Green: 466-620 nm
Red: 590-710 nm
NIR: 715-918 nm
Data use: mapping, change detection, planning (engineering, natural resources, urban, infrastructure), land use, EIA, tourism, military, crop management, environmental monitoring.
66Compiled by : Dr. Zubairul Islam
  • 67. Ikonos: Specifications
Data product: Ikonos. Provider: DigitalGlobe. Orbit: sun-synchronous, 681 km orbital height. Spatial resolution: 80 cm panchromatic, 3.2 m multispectral. Swath width: 11.3 km. Pass over time: every 3 days. Date range of acquisition: since launch on September 24, 1999.
Spectral coverage:
Panchromatic (B/W): 445-900 nm
Blue: 445-516 nm
Green: 506-595 nm
Red: 632-698 nm
NIR: 757-853 nm
Data use: mapping, change detection, planning (engineering, natural resources, urban, infrastructure), land use, EIA, tourism, military, crop management, environmental monitoring.
67Compiled by : Dr. Zubairul Islam
  • 68. Chapter – 4 - Digital image processing 4.1 - Raster Bands 4.2 - Software Demonstration: Arc GIS 9.3 4.3 – General Exercises Lab 4.3.1 – Add Raster Data into ArcMap Lab 4.3.2 - Clip image using ArcGIS and Spatial Analyst 4.4 – Radiometric Corrections Lab 4.4.1 - Conversion of DN Values to reflectance 4.5 – Band Combinations Lab 4.5.1 - Band Combinations with Arc Map 4.6 – Image Classification Lab 4.6.1 - Image Classification 4.7 – Land Use Map Lab 4.7.1 - Create land use map 68Compiled by : Dr. Zubairul Islam
  • 69. 4.1 - Raster Bands Rasters may be single band or multiband. An example of a single-band raster dataset is a digital elevation model (DEM). Each cell in a DEM contains only one value representing surface elevation. Most satellite imagery has multiple bands, each typically containing values within a range, or band, of the electromagnetic spectrum. There are three main ways to display single-band raster datasets: 1. Using two colors—In a binary image, each cell has a value of 0 or 1 and is often displayed using black and white. This type of display is often used for displaying scanned maps with simple line work, such as parcel maps. 2. Grayscale—In a grayscale image, each cell has a value from 0 to another number, such as 255 or 65535. These are often used for black-and-white aerial photographs. 3. Color map—One way to represent colors on an image is with a color map. A set of values is coded to match a defined set of red, green, and blue (RGB) values. 69Compiled by : Dr. Zubairul Islam
  • 71. When there are multiple bands, every cell location has more than one value associated with it. With multiple bands, each band usually represents a segment of the electromagnetic spectrum collected by a sensor. Bands can represent any portion of the electromagnetic spectrum, including ranges not visible to the eye, such as the infrared or ultraviolet sections. The term band originated from the reference to the color band on the electromagnetic spectrum. When you create a map layer from a raster image, you can choose to display a single band of data or form a color composite from multiple bands. A combination of any three of the available bands in a multiband raster dataset can be used to create RGB composites. By displaying bands together as RGB composites, often more information is gleaned from the dataset than if you were to work with just one band. 71Compiled by : Dr. Zubairul Islam
  • 72. A satellite image, for example, commonly has multiple bands representing different wavelengths from the ultraviolet through the visible and infrared portions of the electromagnetic spectrum. Landsat imagery, for example, is data collected from seven different bands of the electromagnetic spectrum. Bands 1–7, with the exception of band 6, represent data from the visible, near-infrared, and mid-infrared regions. Band 6 collects data from the thermal infrared region. Another example of a multiband image is a true color orthophoto in which there are three bands, each representing either red, green, or blue light. 72Compiled by : Dr. Zubairul Islam
  • 73. PIXEL VALUES A raster consists of a matrix of cells (or pixels) organized into rows and columns (or a grid) where each cell contains a value representing information. Rasters can be digital aerial photographs, imagery from satellites, digital pictures, or even scanned maps. USES OF RASTER DATA Because the structure of raster data is simple, it is exceptionally useful for a wide range of applications. Within a GIS, the uses of raster data fall under four main categories: Rasters as basemaps A common use of raster data in a GIS is as a background display for other feature layers. Below is a raster used as a basemap for road data. 73Compiled by : Dr. Zubairul Islam
  • 74. Rasters as surface maps Rasters are well suited for representing data that changes continuously across a landscape (surface). Elevation values measured from the earth's surface are the most common application of surface maps, but other values, such as rainfall, temperature, concentration, and population density, can also define surfaces that can be spatially analyzed. The raster below displays elevation—using green to show lower elevation and red, pink, and white cells to show higher elevation. 74Compiled by : Dr. Zubairul Islam
  • 75. Rasters as thematic maps Rasters representing thematic data can be derived from analyzing other data. A common analysis application is classifying a satellite image by land-cover categories. Basically, this activity groups the values of multispectral data into classes (such as vegetation type) and assigns a categorical value. Below is an example of a classified raster dataset showing land use. 75Compiled by : Dr. Zubairul Islam
  • 76. General characteristics of raster data In raster datasets, each cell (which is also known as a pixel) has a value. The cell values represent the phenomenon portrayed by the raster dataset, such as a category, magnitude, height, or spectral value. The category could be a land-use class such as grassland, forest, or road. A magnitude might represent gravity, noise pollution, or percent rainfall. Height could represent surface elevation above mean sea level, which can be used to derive slope, aspect, and watershed properties. Spectral values are used in satellite imagery and aerial photography to represent light reflectance and color. Cell values can be either positive or negative, integer, or floating point. Integer values are best used to represent categorical (discrete) data, and floating-point values to represent continuous surfaces. Cells can also have a NoData value to represent the absence of data. 76Compiled by : Dr. Zubairul Islam
  • 77. The area (or surface) represented by each cell consists of the same width and height and is an equal portion of the entire surface represented by the raster. For example, a raster representing elevation (that is, digital elevation model) may cover an area of 100 square kilometers. If there were 100 cells in this raster, each cell would represent one square kilometer of equal width and height (that is, 1 km x 1 km). 77Compiled by : Dr. Zubairul Islam
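A quick arithmetic check of the example above; the area and cell count are the illustrative numbers from the slide.

# Illustrative: cell width for a square-celled raster covering a known area.
import math

area_km2 = 100       # total area represented by the raster
n_cells = 100        # number of equal, square cells

cell_area_km2 = area_km2 / n_cells
cell_width_km = math.sqrt(cell_area_km2)
print(cell_width_km)   # 1.0 -> each cell represents 1 km x 1 km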
  • 78. The dimension of the cells can be as large or as small as needed to represent the surface conveyed by the raster dataset and the features within the surface, such as a square kilometer, square foot, or even a square centimeter. The cell size determines how coarse or fine the patterns or features in the raster will appear. The smaller the cell size, the smoother or more detailed the raster will be. However, the greater the number of cells, the longer it will take to process, and it will increase the demand for storage space. If a cell size is too large, information may be lost or subtle patterns may be obscured. For example, if the cell size is larger than the width of a road, the road may not exist within the raster dataset. In the diagram below, you can see how this simple polygon feature will be represented by a raster dataset at various cell sizes. 78Compiled by : Dr. Zubairul Islam
  • 79. 4.2 - SOFTWARE DEMONSTRATION An overview of ArcMap ArcMap is where you display and explore the datasets for your study area, where you assign symbols, and where you create map layouts for printing or publication. ArcMap is also the application you use to create and edit datasets. ArcMap represents geographic information as a collection of layers and other elements in a map. Common map elements include the data frame containing map layers for a given extent plus a scale bar, north arrow, title, descriptive text, a symbol legend, and so on. ArcMap documents When you save a map you have created in ArcMap, it will be saved as a file on disk. A filename extension (.mxd) will be automatically appended to your map document name. You can work with an existing .mxd by double-clicking the document to open it. This will start an ArcMap session for that .mxd. Map documents contain display properties of the geographic information you work with in the map—such as the properties and definitions of your map layers, data frames, and the map layout for printing—plus any optional customizations and macros that you add to your map. 79Compiled by : Dr. Zubairul Islam
  • 80. Views in ArcMap ArcMap displays map contents in one of two views: 1. Data view 2. Layout view Each view lets you look at and interact with the map in a specific way. In data view, the active data frame is presented as a geographic window in which map layers are displayed and used. 80Compiled by : Dr. Zubairul Islam
  • 81. Layout view is used to design and author a map for printing, exporting, or publishing. You can manage map elements within the page space (typically, inches or centimeters), add new map elements, and preview what your map will look like before exporting or printing it. Common map elements include: data frames with map layers, scale bars, north arrows, symbol legends, map titles, text, and other graphical elements. 81Compiled by : Dr. Zubairul Islam
  • 82. LAB 4.3.1 – Add Raster Data into ArcMap 1. Open ArcMap- it is one of several programs within the package titled ArcGIS. Once the program is open it will prompt you with this window: 2. Make sure “A new empty map” is selected and click “Ok.” 4.3 – GENERAL EXERCISES 82Compiled by : Dr. Zubairul Islam
  • 83. 3. To add data to your map, click on the "Add Data" icon. Navigate to the folder and one of the Landsat scenes you are interested in mosaicking and click "Add." 4. When ArcMap prompts you with this window: click "Yes" and wait for the few moments it takes to display the data (a progress bar usually appears in the lower right-hand corner of the screen). 5. Repeat steps 3 and 4 for the other scenes you wish to add. 83Compiled by : Dr. Zubairul Islam
  • 84. LAB 4.3.2 - Clip image using ArcGIS STEPS 1. Add data – raster + vector 2. Data Management Tools 3. Raster 4. Raster Processing 5. Clip 6. Input Raster 7. Output Extent 8. Use Input Features for Clipping Geometry 9. Output Raster Dataset 10. NoData Value - 0 11. OK 12. Result Note – repeat the same exercise for bands 2, 3, 4, 5. A scripted equivalent of these steps is sketched below. 84Compiled by : Dr. Zubairul Islam
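The same clip can be run from Python with the Clip (Data Management) tool, the scripted counterpart of the dialog steps above. This is a minimal sketch: it assumes arcpy (ArcGIS 10+ syntax; the ArcGIS 9.3 used in this module exposes the same tool through its dialogs and the older arcgisscripting module), and the folder and file names are hypothetical.

# Minimal sketch (assumed paths and file names): clip a Landsat band with a
# polygon boundary using the Clip (Data Management) tool.
import arcpy

arcpy.env.workspace = r"C:\remote_sensing\lab"   # hypothetical working folder

in_raster  = "band1.tif"        # input Landsat band
clip_poly  = "study_area.shp"   # vector boundary used as output extent
out_raster = "band1_clip.tif"   # output raster dataset

# "#" lets the tool take the rectangle from the clipping features;
# "0" is the NoData value; "ClippingGeometry" clips to the polygon outline.
arcpy.Clip_management(in_raster, "#", out_raster, clip_poly,
                      "0", "ClippingGeometry")

# Repeat for the remaining bands, as in the lab instructions.
for band in ("band2.tif", "band3.tif", "band4.tif", "band5.tif"):
    arcpy.Clip_management(band, "#", band.replace(".tif", "_clip.tif"),
                          clip_poly, "0", "ClippingGeometry")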
  • 85. 4.4 - Radiometric correction Radiometric correction removes or reduces radiometric errors or distortions. When reflected electromagnetic energy is observed by a sensor, the observed energy does not match the energy reflected from the same object observed from a short distance. This is due to the sun's azimuth and elevation, atmospheric conditions such as fog or aerosols, the sensor's response, etc., which influence the observed energy. Therefore, in order to obtain the real reflectance, those radiometric distortions must be corrected. 85Compiled by : Dr. Zubairul Islam
  • 86. Radiometric correction involves converting DN values to radiance or reflectance. Radiance is the intensity of radiation leaving the ground in a given direction; its unit of measurement is W/m²/sr. Reflectance is the ratio between the reflected radiance and the incoming irradiance, a unitless quantity. 86Compiled by : Dr. Zubairul Islam
  • 87. Radiometric correction of Landsat 8 Data The standard Landsat 8 products provided by the USGS EROS Center consist of quantized and calibrated scaled Digital Numbers (DN) representing multispectral image data acquired by both the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS). Background The products are delivered in 16-bit unsigned integer format and can be rescaled to the Top Of Atmosphere (TOA) reflectance and/or radiance using radiometric rescaling coefficients provided in the product metadata file (MTL file), as briefly described below. 87Compiled by : Dr. Zubairul Islam
  • 88. Conversion to TOA Radiance OLI and TIRS band data can be converted to TOA spectral radiance using the radiance rescaling factors provided in the metadata file: Lλ = ML × Qcal + AL ……. Eq. 01 where: Lλ = TOA spectral radiance (Watts/(m² * srad * μm)) ML = Band-specific multiplicative rescaling factor from the metadata (RADIANCE_MULT_BAND_x, where x is the band number) AL = Band-specific additive rescaling factor from the metadata (RADIANCE_ADD_BAND_x, where x is the band number) Qcal = Quantized and calibrated standard product pixel values (DN) 88Compiled by : Dr. Zubairul Islam
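A small, hedged Python sketch of Eq. 01: it reads the band-specific rescaling factors from the MTL metadata file (simple KEY = VALUE text lines) and applies them to an array of DN values. The MTL file name and the DN array are illustrative assumptions, not part of the standard product.

# Sketch (assumed file name and example DN values):
# apply Eq. 01, L = ML * Qcal + AL, using the RADIANCE_MULT_BAND_x and
# RADIANCE_ADD_BAND_x keys from the Landsat 8 MTL file.
import numpy as np

def read_mtl(path):
    """Parse the MTL metadata file into a {KEY: value-string} dictionary."""
    meta = {}
    with open(path) as f:
        for line in f:
            if "=" in line:
                key, value = line.split("=", 1)
                meta[key.strip()] = value.strip().strip('"')
    return meta

mtl = read_mtl("LC08_scene_MTL.txt")           # hypothetical MTL file name
band = 4
ML = float(mtl[f"RADIANCE_MULT_BAND_{band}"])   # multiplicative rescaling factor
AL = float(mtl[f"RADIANCE_ADD_BAND_{band}"])    # additive rescaling factor

Qcal = np.array([[7423, 8120], [9034, 10011]], dtype=float)  # example DN values
L = ML * Qcal + AL                              # TOA spectral radiance (Eq. 01)
print(L)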
  • 89. Conversion to TOA Reflectance OLI band data can also be converted to TOA planetary reflectance using reflectance rescaling coefficients provided in the product metadata file (MTL file). The following equation is used to convert DN values to TOA reflectance for OLI data: ρλ' = Mρ × Qcal + Aρ ……. Eq. 02 where: ρλ' = TOA planetary reflectance, without correction for solar angle. Mρ = Band-specific multiplicative rescaling factor from the metadata (REFLECTANCE_MULT_BAND_x, where x is the band number) Aρ = Band-specific additive rescaling factor from the metadata (REFLECTANCE_ADD_BAND_x, where x is the band number) Qcal = Quantized and calibrated standard product pixel values (DN) 89Compiled by : Dr. Zubairul Islam
  • 90. TOA reflectance with a correction for the sun angle is then: ρλ = ρλ' / sin(θSE) = (Mρ × Qcal + Aρ) / sin(θSE) ……. Eq. 03 where: ρλ = TOA planetary reflectance θSE = Local sun elevation angle. The scene center sun elevation angle in degrees is provided in the metadata (SUN_ELEVATION). 90Compiled by : Dr. Zubairul Islam
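A self-contained sketch of Eq. 02 and Eq. 03 in Python. The rescaling coefficients and sun elevation would normally be read from the REFLECTANCE_MULT_BAND_x, REFLECTANCE_ADD_BAND_x and SUN_ELEVATION keys of the MTL file; here they are hard-coded example values, and the DN array is illustrative.

# Sketch (assumed coefficient values and example DN values):
# DN -> TOA reflectance with sun-elevation correction (Eq. 02 and Eq. 03).
import math
import numpy as np

Mp = 2.0e-05        # example value of REFLECTANCE_MULT_BAND_x from the MTL file
Ap = -0.1           # example value of REFLECTANCE_ADD_BAND_x from the MTL file
sun_elev = 57.3     # example SUN_ELEVATION in degrees (scene centre)

Qcal = np.array([[7423, 8120], [9034, 10011]], dtype=float)  # example DN values

rho_prime = Mp * Qcal + Ap                            # Eq. 02: uncorrected reflectance
rho = rho_prime / math.sin(math.radians(sun_elev))    # Eq. 03: sun-angle corrected
print(rho)   # values should fall roughly between 0 and 1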
  • 91. LAB 4.4.1 - Landsat 8 data - conversion of DN values to reflectance with ArcGIS 1. Add Data 2. Open Spatial Analyst 3. Raster Calculator 4. Enter the formula 5. Evaluate 6. Right click on the calculation 7. Data 8. Make Permanent – save to your folder RESULT Note: reflectance is unitless and ranges from 0 to 1; it varies from one material to another. Repeat the same exercise for bands 2, 3, 4, 5. A scripted version of the Raster Calculator step is sketched below. 91Compiled by : Dr. Zubairul Islam
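For reference, the Raster Calculator expression can also be written as Spatial Analyst map algebra in Python. This is a sketch only: it assumes arcpy with the Spatial Analyst extension (ArcGIS 10+ syntax; the 9.3 Raster Calculator dialog in the lab evaluates the same expression), and the file names and coefficient values are hypothetical stand-ins for the MTL entries.

# Sketch (assumed names and coefficients): the Raster Calculator step as
# Spatial Analyst map algebra, applying (Mp * DN + Ap) / sin(sun_elevation).
import math
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")               # Spatial Analyst licence
arcpy.env.workspace = r"C:\remote_sensing\lab"   # hypothetical folder

Mp = 2.0e-05        # REFLECTANCE_MULT_BAND_4 from the MTL file (example value)
Ap = -0.1           # REFLECTANCE_ADD_BAND_4 from the MTL file (example value)
sun_elev = 57.3     # SUN_ELEVATION in degrees (example value)

dn = Raster("band4_clip.tif")                    # clipped DN raster from Lab 4.3.2
reflectance = (Mp * dn + Ap) / math.sin(math.radians(sun_elev))
reflectance.save("band4_reflectance.tif")        # equivalent of "Make Permanent"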
  • 92. 4.5 - Band Combinations Landsat 8 measures different ranges of frequencies along the electromagnetic spectrum – a color, although not necessarily a color visible to the human eye. Each range is called a band, and Landsat 8 has 11 bands. Landsat numbers its red, green, & blue sensors as 4, 3, & 2, so when we combine them we get a true-color image such as this one: 92Compiled by : Dr. Zubairul Islam
  • 93. LAB 4.5.1 – Band Combinations of Images STEPS FOR CREATING A TRUE COLOR COMBINATION OF IMAGES 1. Add Data 2. Data Management Tools 3. Raster 4. Raster Processing 5. Composite Bands 6. Input Rasters 7. Output Location 8. OK RESULT – Note: in this image green will appear for vegetation, brown for barren and rocky land, and blue for water. A scripted equivalent is sketched below. 93Compiled by : Dr. Zubairul Islam
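The composite can also be built from Python with the Composite Bands (Data Management) tool. A minimal sketch, assuming arcpy and hypothetical band file names (for example the reflectance outputs of Lab 4.4.1), follows.

# Sketch (assumed file names): stack red, green and blue bands into a single
# multiband raster with the Composite Bands (Data Management) tool.
import arcpy

arcpy.env.workspace = r"C:\remote_sensing\lab"   # hypothetical folder

# Landsat 8 true colour uses bands 4 (red), 3 (green) and 2 (blue), in that order.
in_bands = ["band4_reflectance.tif",
            "band3_reflectance.tif",
            "band2_reflectance.tif"]

arcpy.CompositeBands_management(in_bands, "landsat8_truecolor.tif")
# Display the result in ArcMap with the RGB Composite renderer
# (R = band 1, G = band 2, B = band 3 of the new composite).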
  • 94. The goal of classification is to assign each cell in the study area to a known class (supervised classification) or to a cluster (unsupervised classification). In both cases, the input to classification is a signature file containing the multivariate statistics of each class or cluster. The result of classification is a map that partitions the study area into classes. Classifying locations into naturally occurring classes corresponding to clusters is also referred to as stratification. 4.6 – IMAGE CLASSIFICATION 94Compiled by : Dr. Zubairul Islam
  • 95. LAB 4.6.1 - Unsupervised classification In an unsupervised classification, you divide the study area into a specified number of statistical clusters. These clusters are then interpreted into meaningful classes. Unsupervised classification is done in two stages (a scripted sketch of both stages is given after the next slide): A. Creating the Iso Cluster signature file B. Creating the Maximum Likelihood Classification A. Steps for Iso Cluster 1. Add data – composite image 2. Spatial Analyst Tools 3. Multivariate 4. Iso Cluster 5. Input raster bands 6. Check output location 7. Number of classes 8. OK 95Compiled by : Dr. Zubairul Islam
  • 96. B. Steps for Creating the Maximum Likelihood Classification 1. Spatial Analyst Tools 2. Multivariate 3. Maximum Likelihood Classification 4. Input raster bands 5. Input signature file 6. Output classified raster 7. OK RESULT Note: the image has been divided into 3 clusters. 96Compiled by : Dr. Zubairul Islam
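Both stages of the unsupervised classification (Iso Cluster to create the signature file, then Maximum Likelihood Classification to produce the classified raster) can also be scripted with the Spatial Analyst module. This is a minimal sketch, assuming arcpy (ArcGIS 10+ syntax) and hypothetical file names; the 9.3 dialogs used in the lab run the same two tools.

# Sketch (assumed names): unsupervised classification of the composite image
# in two stages, mirroring Lab 4.6.1.
import arcpy
from arcpy.sa import IsoCluster, MLClassify

arcpy.CheckOutExtension("Spatial")               # Spatial Analyst licence
arcpy.env.workspace = r"C:\remote_sensing\lab"   # hypothetical folder

composite = "landsat8_truecolor.tif"             # multiband raster from Lab 4.5.1
signatures = "clusters.gsg"                      # output signature file
num_classes = 3                                  # number of clusters, as in the lab

# Stage A: Iso Cluster writes the multivariate statistics of each cluster
# to a signature file.
IsoCluster(composite, signatures, num_classes)

# Stage B: Maximum Likelihood Classification assigns every cell to the most
# probable cluster described by the signature file.
classified = MLClassify(composite, signatures)
classified.save("unsupervised_classes.tif")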
  • 97. 4.7 – Land use Map A land use map shows the types and intensities of different land uses in a particular area. For the land use map, the image classified in the last exercise may be used as follows. This exercise comprises two steps. LAB 4.7.1 - Create a land use map with the classified raster image created in the last exercise First step 1. Right click on the classified image 2. Properties 3. Symbology 4. Double click on a colored box 5. Change the land use type 6. Apply 7. OK Cont. to second step 97Compiled by : Dr. Zubairul Islam
  • 98. Second step 1. Click Layout View 2. Use the layout tools to position the map 3. Add a title 4. Legend (set items) 5. Scale bar (set properties) 6. File – Export – Save RESULT Note – this map may now be used for your work. 98Compiled by : Dr. Zubairul Islam
  • 99. END OF CHAPTER 4 99Compiled by : Dr. Zubairul Islam
  • 100. CHAPTER FIVE GPS and remote sensing 5.1. Introduction 5.2. Satellite based positioning 5.3. General Functions of Garmin GPS – 60 5.4. Exercise – Create Waypoints 100Compiled by : Dr. Zubairul Islam
  • 101. • GPS and remote sensing imagery are primary, and very important, GIS data sources. • GPS data creates points (positions), polylines, or polygons for GIS • Remote sensing imagery is used as a major base map in GIS 5.1. Introduction 101Compiled by : Dr. Zubairul Islam
  • 102. 5.2. Satellite based positioning • GPS is a satellite navigation system. • GPS is funded and controlled by the U.S. Department of Defense (DOD). While there are many thousands of civil users of GPS world-wide, the system was designed for and is operated by the U.S. military. • GPS provides specially coded satellite signals that can be processed in a GPS receiver, enabling the receiver to compute position, velocity and time. • At least 4 satellites are used to estimate 4 quantities: position in 3-D (X, Y, Z) and receiver clock time (T) - see the positioning sketch below. • The nominal GPS Operational Constellation consists of 24 satellites that orbit the earth. 102Compiled by : Dr. Zubairul Islam
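To illustrate why at least four satellites are needed, the sketch below solves the four unknowns (X, Y, Z and the receiver clock bias) from four pseudoranges with a simple Gauss-Newton least-squares iteration. It is purely illustrative: the satellite coordinates and pseudoranges are made-up numbers, not real GPS data, and real receivers apply many corrections (satellite clocks, ionosphere, troposphere) that are ignored here.

# Illustrative sketch: estimate receiver position (x, y, z) and clock bias b
# from pseudoranges p_i = ||x - s_i|| + b, using Gauss-Newton least squares.
import numpy as np

def solve_position(sat_positions, pseudoranges, iterations=10):
    sats = np.asarray(sat_positions, dtype=float)   # shape (n, 3), metres
    p = np.asarray(pseudoranges, dtype=float)        # shape (n,), metres
    x = np.zeros(3)      # initial position guess (Earth centre)
    b = 0.0              # clock bias, expressed in metres
    for _ in range(iterations):
        diffs = x - sats                              # satellite-to-receiver vectors
        ranges = np.linalg.norm(diffs, axis=1)        # geometric ranges
        residuals = p - (ranges + b)                  # observed minus predicted
        # Jacobian: d(range + b)/dx = unit vector towards receiver, d/db = 1
        J = np.hstack([diffs / ranges[:, None], np.ones((len(p), 1))])
        delta, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        x += delta[:3]
        b += delta[3]
    return x, b

# Made-up geometry for demonstration: pick a "true" receiver state, generate
# consistent pseudoranges, then recover the state with the solver.
sats = [(15600e3,  7540e3, 20140e3),
        (18760e3,  2750e3, 18610e3),
        (17610e3, 14630e3, 13480e3),
        (19170e3,   610e3, 18390e3)]
true_xyz = np.array([1.1e6, 2.2e6, 6.0e6])
true_bias = 30.0
p = [np.linalg.norm(true_xyz - np.array(s)) + true_bias for s in sats]

est_xyz, est_bias = solve_position(sats, p)
print(est_xyz, est_bias)   # should closely reproduce true_xyz and true_bias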
  • 103. 5.3 - General Functions of Garmin GPS - 60 1. To turn the unit on/off press black button (hold down to turn off) 2. Use ROCKER button to move cursor up and down/left and right 3. To page through/between screens press PAGE/QUIT buttons 4. To find the menu press MENU button twice 5. Use ENTER key to select highlighted fields, enter data, or confirm on screen messages. 6. To cancel out of screen and not save press QUIT 7. Use MARK key to mark your current location as way point. 8. Use FIND key to view the find page. 9. To zoom in map screen press IN/OUT button 103Compiled by : Dr. Zubairul Islam
  • 104. •Press and release “Mark” •Edit name field (top) •“OK” 5.4 – Exercise to create waypoints 104Compiled by : Dr. Zubairul Islam
  • 105. 5.5 – Import Waypoints with Google Earth Start Google Earth and connect the GPS to your PC, then: 1. Tools 2. GPS 3. Import waypoints 4. Import 105Compiled by : Dr. Zubairul Islam
  • 106. 106Compiled by : Dr. Zubairul Islam The End