APPLICATION OF REMOTE SENSING AND
GEOGRAPHICAL INFORMATION SYSTEM IN
CIVIL ENGINEERING
Date:
INSTRUCTOR
DR. MOHSIN SIDDIQUE
ASSIST. PROFESSOR
DEPARTMENT OF CIVIL ENGINEERING
Optical Remote Sensing
Optical remote sensing makes use of visible, near-infrared and short-wave infrared sensors to form images of the Earth's surface by detecting the solar radiation reflected from targets on the ground.
Photography
(Photogrammetry)
Thermal Scanner
Multispectral
2
Optical remote sensing systems are classified into the following types,
depending on the number of spectral bands used in the imaging process.
Panchromatic imaging system
(1 band)
Multispectral imaging system
(3-10 bands)
Superspectral imaging system
(10-50 bands)
Hyperspectral imaging system
(50-300 bands)
Optical Remote Sensing Systems
3
The sensor is a single-channel detector sensitive to radiation within a broad
wavelength range.
If the wavelength range coincides with the visible range, then the resulting
image resembles a "black-and-white" photograph taken from space.
The physical quantity being measured is the apparent brightness of the
targets; the spectral information, or "colour", of the targets is lost.
Examples of panchromatic imaging systems are:
Worldview-1
IKONOS PAN
SPOT HRV-PAN
Panchromatic Imaging System
4
Panchromatic Imaging System
5
The sensor is a multichannel detector with a few spectral bands.
Each channel is sensitive to radiation within a narrow wavelength band.
The resulting image is a multilayer image which contains both the brightness
and spectral (colour) information of the targets being observed.
Examples of multispectral systems are:
ALOS AVNIR-2
Landsat MSS/TM/ETM
SPOT HRV-XS
IKONOS MS
Multispectral Imaging System
6
Multispectral Sensors
7
Landsat 7 ETM Technical Specification
8
A superspectral imaging sensor has many more spectral channels (typically
>10) than a multispectral sensor.
The bands have narrower bandwidths, enabling the finer spectral
characteristics of the targets to be captured by the sensor.
Examples of superspectral systems are:
MODIS
MERIS
Superspectral Imaging Systems
9
MODIS Specifications
Orbit: 705 km, 10:30 a.m. descending node (Terra) or 1:30 p.m. ascending node (Aqua); sun-synchronous, near-polar, circular
Scan Rate: 20.3 rpm, cross-track
Swath Dimensions: 2330 km (cross-track) by 10 km (along-track at nadir)
Telescope: 17.78 cm diameter
Size: 1.0 x 1.6 x 1.0 m; Weight: 228.7 kg; Power: 162.5 W
Data Rate: 10.6 Mbps (peak daytime); 6.1 Mbps (orbital average)
Quantization: 12 bits
Spatial Resolution: 250 m (bands 1-2), 500 m (bands 3-7) and 1000 m (bands 8-36)
Design Life: 6 years
10
MODIS Specifications: Spectral Bands

Band   Wavelength (nm)    Resolution   Primary Use
1      620–670            250 m        Land/Cloud/Aerosols Boundaries
2      841–876            250 m        Land/Cloud/Aerosols Boundaries
3      459–479            500 m        Land/Cloud/Aerosols Properties
4      545–565            500 m        Land/Cloud/Aerosols Properties
5      1230–1250          500 m        Land/Cloud/Aerosols Properties
6      1628–1652          500 m        Land/Cloud/Aerosols Properties
7      2105–2155          500 m        Land/Cloud/Aerosols Properties
8      405–420            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
9      438–448            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
10     483–493            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
11     526–536            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
12     546–556            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
13     662–672            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
14     673–683            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
15     743–753            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
16     862–877            1000 m       Ocean Color/Phytoplankton/Biogeochemistry
17     890–920            1000 m       Atmospheric Water Vapor
18     931–941            1000 m       Atmospheric Water Vapor
19     915–965            1000 m       Atmospheric Water Vapor

Band   Wavelength (µm)    Resolution   Primary Use
20     3.660–3.840        1000 m       Surface/Cloud Temperature
21     3.929–3.989        1000 m       Surface/Cloud Temperature
22     3.929–3.989        1000 m       Surface/Cloud Temperature
23     4.020–4.080        1000 m       Surface/Cloud Temperature
24     4.433–4.498        1000 m       Atmospheric Temperature
25     4.482–4.549        1000 m       Atmospheric Temperature
26     1.360–1.390        1000 m       Cirrus Clouds Water Vapor
27     6.535–6.895        1000 m       Cirrus Clouds Water Vapor
28     7.175–7.475        1000 m       Cirrus Clouds Water Vapor
29     8.400–8.700        1000 m       Cloud Properties
30     9.580–9.880        1000 m       Ozone
31     10.780–11.280      1000 m       Surface/Cloud Temperature
32     11.770–12.270      1000 m       Surface/Cloud Temperature
33     13.185–13.485      1000 m       Cloud Top Altitude
34     13.485–13.785      1000 m       Cloud Top Altitude
35     13.785–14.085      1000 m       Cloud Top Altitude
36     14.085–14.385      1000 m       Cloud Top Altitude
11
It is also known as an "imaging spectrometer"; it acquires images in about a
hundred or more contiguous spectral bands.
The precise spectral information contained in a hyperspectral image enables
better characterisation and identification of targets.
Hyperspectral images have potential applications in fields such as precision
agriculture (e.g. monitoring the types, health, moisture status and maturity of
crops) and coastal management (e.g. monitoring of phytoplankton, pollution and
bathymetry changes).
Examples of hyperspectral systems are:
Hyperion on EO1 satellite
AVIRIS (Airborne Visible/Infrared Imaging Spectrometer)
Hyperspectral Imaging Systems
12
AVIRIS Image Cube
13
Hyperspectral Image Cube
The science of making measurements from photographs is called photogrammetry.
Evolution of photography:
Development of new photographic techniques and equipment
Development of new platforms for collection of imagery
Black-and-white photography
Color photography
Color infrared photography
Photogrammetry
15
How Do We Get an Image?
16
What is an Image?
17
A 3000-row by 3000-column satellite image has three spectral
channels. If each pixel is represented by 8 bits (1 byte) per channel,
how many bytes of computer memory are required to store the image?
Digital Format of Image
Pixels are the smallest units of an image.
18
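A quick worked check of the storage question above: the required memory is simply rows x columns x channels x bytes per channel. A minimal sketch in Python (plain arithmetic, no image data assumed):

```python
# Memory required for a 3000 x 3000 image with 3 spectral channels at 1 byte per channel
rows, cols, channels = 3000, 3000, 3
bytes_per_channel = 1  # 8 bits

total_bytes = rows * cols * channels * bytes_per_channel
print(total_bytes)                  # 27,000,000 bytes
print(total_bytes / (1024 ** 2))    # about 25.7 MB
```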
Digital Format of Image
2D Image
19
Digital Format of Image
3D Image
20
Height Measurement using Parallax
21
An anaglyph image is a method of encoding a three-dimensional image in a
single picture by superimposing a pair of pictures.
The left image has the blue and green color channels removed to leave a
purely red picture, while the right image has the red channel removed.
The two images are superimposed into one picture that closely resembles the
original, with red and cyan fringes around objects where the stereo
separation produces differences between the original images.
Stereo Viewing - Anaglyph
The concept of Stereo imaging is used in 3D movies !!
22
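To make the channel mixing described above concrete, here is a minimal sketch in Python using Pillow and NumPy. It assumes a co-registered stereo pair of equal size saved as left.jpg and right.jpg; the file names are placeholders.

```python
# Red-cyan anaglyph: red channel from the left image, green and blue from the right image
import numpy as np
from PIL import Image

# Placeholder file names; both images are assumed co-registered and the same size
left = np.asarray(Image.open("left.jpg").convert("RGB"))
right = np.asarray(Image.open("right.jpg").convert("RGB"))

anaglyph = np.zeros_like(left)
anaglyph[..., 0] = left[..., 0]      # keep only red from the left image
anaglyph[..., 1:] = right[..., 1:]   # keep green and blue (cyan) from the right image

Image.fromarray(anaglyph).save("anaglyph.jpg")  # view with red-cyan glasses
```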
A common type of solid-state detector
in current use is the charge coupled
device (CCD).
At a specific pixel location, the CCD
element is exposed to incident light
energy and it builds up an electric
charge proportional to the intensity of
the incident light.
The electric charge is subsequently
amplified and converted from analog
to digital form.
A large number of CCDs can be
combined on a silicon chip in a one-
or two-dimensional array.
Digital Imaging Device
23
Digital Imaging Device
24
Scale of Vertical Photograph
The ratio of a distance on an image or map to the corresponding distance on the
ground is referred to as scale.
25
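The slide defines scale as a ratio; for a vertical photograph the usual textbook relation (not shown on the slide) is S = f / (H - h), with focal length f, flying height H above the datum and terrain elevation h. A small numeric sketch with illustrative values:

```python
# Scale of a vertical photograph at terrain elevation h, using S = f / (H - h)
f = 0.152     # focal length in metres (152 mm, a common mapping-camera value)
H = 3000.0    # flying height above the datum, metres (illustrative)
h = 500.0     # terrain elevation, metres (illustrative)

scale = f / (H - h)
print(f"scale = 1:{1 / scale:.0f}")   # -> scale = 1:16447
```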
Geometry of Vertical Photograph
(Nadir)
26
Types of distortion include:
Atmospheric refraction of light rays
Image motion or camera shake
Lens distortion (interior orientation)
Types of displacement include:
Curvature of the Earth
Tilt (exterior orientation)
Topographic or relief (including object height)
Types of Distortion and Displacement
27
Since atmospheric density decreases
with altitude, light rays do not
travel in straight lines through the
atmosphere.
They are bent according to Snell's
law.
Photogrammetric equations assume
that light rays travel in straight
paths; to compensate for the known
refracted paths, corrections are
applied to the image coordinates.
Atmospheric Refraction
28
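For reference, Snell's law mentioned above relates the angles of incidence and refraction at the boundary between media of refractive indices n1 and n2:

```latex
% Snell's law: bending of a ray crossing media of different refractive index
n_1 \sin\theta_1 = n_2 \sin\theta_2
```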
Small flaws in the optical
components (i.e. the lens) of
camera systems lead to distortions.
They are typically more serious at
the edges of photos.
These effects are radial from the
principal point (making objects
appear either closer to or farther
from the principal point than they
actually are) and may be
corrected using calibration curves.
Lens Distortions (Interior Orientation)
29
The geoid is an equipotential gravity surface, which is considered to be mean
sea level.
A reference ellipsoid is a mathematically defined surface which approximates
the geoid globally.
Curvature of Earth
30
Definition of a Reference Ellipsoid
31
a is the semi-major axis of the ellipse, and is identified as the equatorial
radius of the ellipsoid.
b is the semi-minor axis of the ellipse, and is identified with the polar
distance (from the centre).
These two lengths completely specify the shape of the ellipsoid, but in
practice geodesy publications classify reference ellipsoids by giving the
semi-major axis a and the inverse flattening, 1/f.
The flattening, f, is simply a measure of how much the symmetry axis is
compressed relative to the equatorial radius: f = (a − b) / a.
WGS84: World Geodetic System 1984
GRS80: Geodetic Reference System 1980
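For concreteness, the standard published defining constants of the two ellipsoids named above (these values are not on the slide); the semi-minor axis follows from b = a(1 − f):

```latex
% Standard defining constants (semi-major axis and inverse flattening)
\text{WGS84:}\quad a = 6\,378\,137.0\ \text{m}, \qquad 1/f = 298.257223563 \\
\text{GRS80:}\quad a = 6\,378\,137.0\ \text{m}, \qquad 1/f = 298.257222101 \\
b = a\,(1 - f) \approx 6\,356\,752.3\ \text{m}
```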
All photos have some tilt; the
perfect gyro-stabilization unit, like the
perfect lens, has yet to be built.
A tilted photograph presents a
slightly oblique view rather than a
true vertical record.
Tilt is caused by the rotation of the
platform away from the vertical.
If the amount and direction of tilt are
known then the photo may be
rectified.
Tilt Displacement
Camera orientation for photographs
(Exterior orientation)
32
The collinearity equation is a physical model representing the geometry between
the sensor (projection center), the ground coordinates of an object and its
image coordinates.
Geometry between Image and Ground
33
Applying the rotation matrix to the ground coordinates P(X, Y, Z) gives the
image-space coordinates p(up, vp, wp).
Collinearity Equation (cont’d)
34
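The collinearity equations themselves (given here in their common textbook form; sign conventions vary between texts) relate the image coordinates (x, y), the focal length f, the exposure station (X0, Y0, Z0) and the elements m_ij of the rotation matrix:

```latex
x = -f \, \frac{m_{11}(X - X_0) + m_{12}(Y - Y_0) + m_{13}(Z - Z_0)}
               {m_{31}(X - X_0) + m_{32}(Y - Y_0) + m_{33}(Z - Z_0)}, \qquad
y = -f \, \frac{m_{21}(X - X_0) + m_{22}(Y - Y_0) + m_{23}(Z - Z_0)}
               {m_{31}(X - X_0) + m_{32}(Y - Y_0) + m_{33}(Z - Z_0)}
```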
This is typically the most serious type of displacement.
It radiates outward from the nadir.
It is caused by the perspective geometry of the camera and the terrain at
varying elevations.
This effect is used for three kinds of measurement:
Stereo viewing (anaglyph)
Height measurement
Topographic mapping (LiDAR)
Topographic Displacement
35
Types of height measurement:
Relief displacement measurement on a single vertical photograph
Stereoscopic measurement based on the parallax of two photographs
(Figures: relief displacement; height measurement with stereoscopic measurement)
36
Relief Displacement
37
Relief Displacement
The effect of relief not only causes a change in scale but can also be
considered a component of image displacement.
The distance d between the two photo points is called relief displacement
because it is caused by the elevation difference h between A and A’.
38
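On a single vertical photo the standard relation is d = r h / H (radial distance r from the nadir point to the displaced image point, object height h, flying height H above the base of the object), so object height follows as h = d H / r. A minimal numeric sketch with illustrative values:

```python
# Object height from relief displacement on a single vertical photo: h = d * H / r
d = 2.0e-3    # measured relief displacement on the photo, metres (2 mm, illustrative)
r = 80.0e-3   # radial distance from nadir to the displaced image point, metres (80 mm)
H = 1200.0    # flying height above the base of the object, metres (illustrative)

h = d * H / r
print(f"object height = {h:.1f} m")   # -> 30.0 m
```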
This is the most widely used method of
measuring heights on air photos.
There are many forms of the parallax
equations.
Stereoscopic parallax corresponds to
the distance between the image points
of the same ground object on the left
and right images.
The height difference can be
computed if the parallax difference
between two points of different height
is measured using a parallax bar.
Parallax Height Method: Stereoscopic Parallax (Epipolar Geometry)
39
Assignment: Derive the expression for h
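For reference only (the slide's assignment is to derive it), one common form of the result expresses the elevation difference in terms of the parallax difference dp measured with a parallax bar, the absolute parallax p_b of the reference point and the flying height H above that point. A minimal numeric sketch with illustrative values:

```python
# Height difference from stereoscopic parallax: dh = H * dp / (p_b + dp)
H = 1500.0      # flying height above the reference (lower) point, metres (illustrative)
p_b = 90.0e-3   # absolute parallax of the reference point, metres (90 mm, illustrative)
dp = 1.8e-3     # parallax difference measured with a parallax bar, metres (1.8 mm)

dh = H * dp / (p_b + dp)
print(f"height difference = {dh:.1f} m")   # -> about 29.4 m
```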
Orthorectification is the process by which the
geometric distortions of the image are
modeled and accounted for.
The orthorectification process yields map-accurate
images which can be highly useful
as base maps and may be easily
incorporated into a GIS.
The success of the orthorectification process
depends on the accuracy of the digital
elevation model (DEM) and the correction
formulae.
Orthorectification
40
Flight Planning
41
Digital Image Matching
42
Correlation Coefficient Computing
43
Assignment: Compute the correlation coefficient, c
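The slide's window values are in its figure, but the computation itself is the normalized cross-correlation between a template window and a search-window patch. A minimal sketch with made-up 3 x 3 grey values (the actual numbers for the assignment come from the slide's figure):

```python
# Normalized cross-correlation (correlation coefficient) between two equal-sized windows
import numpy as np

def correlation_coefficient(template, patch):
    t = template - template.mean()
    p = patch - patch.mean()
    return (t * p).sum() / np.sqrt((t ** 2).sum() * (p ** 2).sum())

# Made-up grey values for illustration only
template = np.array([[10, 12, 11], [13, 15, 14], [11, 13, 12]], dtype=float)
patch    = np.array([[20, 23, 21], [24, 29, 27], [22, 25, 23]], dtype=float)
print(round(correlation_coefficient(template, patch), 3))
```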
A digital elevation model (DEM) is defined as a file or database containing
elevation points over a contiguous area.
DEMs may be subdivided into:
Digital surface models (DSM) that contain elevation information about all
features in the landscape, such as vegetation, buildings, and other
structures;
Digital terrain models (DTM) that contain elevation information about the
bare-Earth surface without the influence of vegetation or man-made
structures.
Four major technologies are used to obtain elevation information
In situ surveying
Photogrammetry
Interferometric Synthetic Aperture Radar (IFSAR)
Light Detection and Ranging (LiDAR)
Digital Elevation Models
44
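A practical consequence of the DSM/DTM distinction above: subtracting the DTM from the DSM gives a normalized DSM (nDSM) whose values approximate the heights of vegetation and structures above the ground. A minimal sketch with small made-up grids; a real workflow would read two co-registered elevation rasters instead of these arrays:

```python
# Normalized DSM: approximate feature heights as DSM minus DTM
import numpy as np

dsm = np.array([[105.0, 112.0], [104.5, 118.0]])   # surface elevations, metres (made-up)
dtm = np.array([[104.8, 104.9], [104.4, 105.0]])   # bare-earth elevations, metres (made-up)

ndsm = dsm - dtm   # heights of buildings/vegetation above the terrain
print(ndsm)
```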
Digital Elevation Models
45
Extracting terrain parameters
Modeling water flow or mass movement (e.g., avalanches and landslides)
Creation of relief maps
Rendering of 3D visualizations
3D flight planning
Creation of physical models (including raised-relief maps)
Rectification of aerial photography or satellite imagery
Reduction (terrain correction) of gravity measurements (gravimetry, physical geodesy)
Terrain analyses in geomorphology and physical geography
Uses of Digital Elevation Models
Geographic Information Systems (GIS)
Engineering and infrastructure design
Global positioning systems (GPS)
Line-of-sight analysis
Base mapping
Flight simulation
Precision farming and forestry
Surface analysis
Intelligent transportation systems (ITS)
Auto safety / Advanced Driver
Assistance Systems (ADAS)
Archaeology
46
Comments….
Questions….
Suggestions….
47
I am greatly thankful to all the information sources
(regarding remote sensing and GIS) on the internet that I
accessed and utilized for the preparation of the present
lecture.
Thank you !
Feel free to contact
