ANURAG UNIVERSITY
DEPARTMENT OF CIVIL ENGINEERING
B.Tech IV Year I Semester-Section A
Batch – 2020 to 2024
Subject
Geospatial Technology
Faculty
Poola Nagesh
Assistant Professor
Commencement of Instructions – 12th June 2023
Last Date of Instructions- 7th October 2023
Academic Calendar /Almanac
Unit 1
• Introduction to Photogrammetry:
Principle and types of aerial photographs,
stereoscopy, Map Vs Mosaic, ground control,
Parallax measurements for height, and
determinations.
• Remote Sensing – I: Basic concepts and
foundation of remote sensing – elements
involved in remote sensing, electromagnetic
spectrum, remote sensing terminology, and
units.
UNIT-II
Remote Sensing – II: Energy resources, energy
interactions with earth surface features and
atmosphere, resolution, sensors and satellites,
visual interpretation techniques, basic elements,
converging evidence, interpretation for terrain
evaluation, spectral properties of water bodies,
introduction to digital data analysis.
BASICS OF PHOTOGRAMMETRY
Photogrammetry
 Introduction & Basic Terms of Photogrammetry
 Significance of Photogrammetry
 Types of Photogrammetry
 Types of Photographs
 Scale of Vertical Photograph
 Relief Displacement
Introduction to Photogrammetry
Photogrammetry is a technique used to create accurate 3D models or
measurements of real-world objects or environments by analyzing
photographs or images. It involves extracting information about the
shape, size, and position of objects or features from a collection of
overlapping images.
The basic principle behind photogrammetry is triangulation. By capturing
multiple images of an object or a scene from different angles, the
software can determine the position of each point in 3D space by
analyzing the geometric relationships between the images. This process
involves identifying common features in the images and matching them
to establish correspondences.
Introduction
• There is no universally accepted definition of photogrammetry.
• The name “photogrammetry" is derived from the three Greek words
– phos or phot which means light,
– gramma which means letter or something drawn, and
– metrein, which means to measure.
• Photogrammetry is the science of obtaining reliable information about the
properties of surfaces and objects without physical contact with the
objects, and of measuring and interpreting this information.
• It is the art of obtaining reliable information about physical objects
and the environment through processes of recording, measuring and
interpreting photographic images.
• The principal point of each photograph is used as a fixed station,
and rays are drawn to get points of intersection, much as in
plane-table surveying.
• As its name implies, it is a three-dimensional coordinate measuring
technique that uses
photographs as the fundamental medium for metrology (measurement).
What is Photogrammetry?
Topographic Photogrammetry | Close-range Photogrammetry
Used for mapping the earth or planets | Used for industrial measurement
Imaging system is mounted on an aircraft / spacecraft | Imaging system is handheld
Target is the ground surface | Target is the object being measured
Image data is processed to create new spatial information products | Image data is processed to make precise measurements
Photograph vs Image
History & Development
• The idea of capturing photos started around the 6th century
• The main stages of development in this area started around
1850
• The development stages can be categorized as Analog →
Analytical → Digital photogrammetry
History & Development
Milestones of Photogrammetry
• Four major phases are directly related to the technological inventions of
photography, airplanes, computers and electronics.
• The first generation, from the middle to the end of the nineteenth century, was very much
a pioneering and experimental phase with remarkable achievements in
terrestrial and balloon photogrammetry.
Analog Photogrammetry
• Optical or mechanical instruments were used to reconstruct 3-
dimensional geometry from 2 overlapping photographs.
• The main product during this phase was topographic maps.
Analytical Photogrammetry
• The computer replaced some expensive optical and mechanical
components.
• The resulting devices were analog / digital hybrids.
• Outputs of analytical photogrammetry were topographic maps and
digital products like DEMs (digital elevation models).
Digital Photogrammetry
• Digital photogrammetry is the art of using
computers to obtain the measurements of objects in a photograph
• The fourth generation, digital photogrammetry, is rapidly emerging as a
new discipline in photogrammetry. In contrast to all other phases,
digital images are used instead of aerial photographs.
Photogrammetry
What does Photogrammetry require?
• Planning and taking photographs
• Processing the photographs
• Measuring photos and reducing measurements to produce
end results.
Field Applications
• Used to conduct topographical surveys or engineering surveys
• Suitable for mountainous / hilly areas with little vegetation
• Used for geological mapping, which includes identification of
landforms, rock types, etc.
• Used in urban and regional planning, disaster management,
transportation, etc.
Photogrammetry
• Advantages
– Used in different fields, such as topographic mapping,
engineering, manufacturing.
– Covers larger areas
– Less time consuming
– Can reach inaccessible and restricted areas
– Economical for large areas; easy to interpret and understand.
• Disadvantages
– Complex system, highly trained human resource needed
– High initial investment cost; heavy and sophisticated equipment needed
– Weather dependent.
The process of photogrammetry typically involves several steps:
Image Acquisition: Capturing a series of overlapping photographs or
images of the object or scene from different angles. The images should
have sufficient detail and overlap to ensure accurate reconstruction.
Image Preprocessing: Cleaning and preparing the images for further
analysis. This may involve adjusting image parameters, such as brightness,
contrast, and color balance, as well as removing lens distortion.
Feature Extraction: Identifying and extracting common features, such as
corners or edges, from the images. These features serve as reference
points for matching and establishing correspondences between images.
Image Matching: Comparing the extracted features across different
images to find corresponding points. This process involves algorithms that
analyze the visual similarities between the features and establish accurate
correspondences.
Bundle Adjustment: Refining the camera positions and orientations by
minimizing the errors in the matching process. Bundle adjustment
ensures the accuracy and consistency of the reconstructed 3D model.
Dense Point Cloud Generation: Creating a dense representation of the
object or scene by estimating the 3D coordinates of numerous points on
its surface. This is achieved by triangulating the matched features and
estimating their positions in 3D space.
Surface Reconstruction: Creating a continuous and detailed surface
model from the dense point cloud. Various algorithms, such as Delaunay
triangulation or mesh generation methods, can be used to create a mesh
representing the object's surface.
Texture Mapping: Applying the original images onto the reconstructed
surface to generate a realistic and textured 3D model. This process
enhances the visual appearance of the model by projecting the images
onto the corresponding surface areas.
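To make the pipeline concrete, here is a minimal sketch of the feature-extraction, matching, and triangulation steps using OpenCV and NumPy. The image file names and the camera matrix K are illustrative assumptions; a full photogrammetry pipeline would add bundle adjustment, dense point-cloud generation, meshing and texturing on top of this sparse result.

import cv2
import numpy as np

# Hypothetical input: two overlapping views and assumed camera intrinsics K
img1 = cv2.imread("view_left.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view_right.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]])

# Feature extraction: SIFT keypoints and descriptors in each image
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Image matching: nearest-neighbour matching with Lowe's ratio test
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Relative orientation from the essential matrix, then triangulation
E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # first camera at origin
P2 = K @ np.hstack([R, t])                          # second camera posed by R, t
X = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)   # homogeneous 4 x N points
points3d = (X[:3] / X[3]).T                         # sparse 3D point cloud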
Photogrammetry has a wide range of applications in various
industries, including architecture, engineering, cultural heritage
preservation, virtual reality, and video game development. It
enables the creation of accurate 3D models, measurements, and
visualizations, providing valuable data for analysis,
documentation, and immersive experiences.
Types of Photographs
TYPES OF PHOTOGRAPHS
• The photographs used in photogrammetry may be broadly
classified into two types depending upon the camera position at the
time of photography.
• The types are: terrestrial photographs and aerial photographs.
Aerial photogrammetry is commonly used in mapping, surveying, and
remote sensing applications. It involves capturing images from an
elevated position, such as an aircraft or a drone, to create large-scale
maps or 3D models of landscapes, buildings, or entire cities.
Terrestrial photogrammetry, on the other hand, focuses on capturing
images from ground-level perspectives. It can be used to create detailed
models of smaller objects, such as archaeological artifacts, sculptures, or
architectural elements. Terrestrial photogrammetry often requires careful
planning of image acquisition and the use of specialized equipment, such
as calibrated cameras and precise measurement tools.
Principle and types of aerial photographs
The principle of aerial photography involves capturing photographs of the
Earth's surface from an elevated position, typically from an aircraft or a
drone. These photographs provide a bird's-eye view perspective and are
used in various applications such as mapping, land surveying,
environmental monitoring, and urban planning. The key principles of aerial
photography include:
Overlapping Coverage: Aerial photographs are captured with a significant
amount of overlap between successive images. This overlap ensures that
there are common features visible in multiple photographs, allowing for
accurate measurement and reconstruction of the terrain or objects.
Scale and Ground Resolution: Aerial photographs have a specific scale
determined by the altitude of the aircraft and the focal length of the camera.
Ground resolution refers to the level of detail captured in the photograph, usually
expressed as the size of a ground feature represented by a pixel in the image.
Vertical Perspective: Aerial photographs are typically captured with a vertical
perspective, meaning that the camera axis is pointed straight down towards the
Earth's surface. This perspective minimizes distortions and simplifies the process
of geometric correction and measurement.
Stereoscopic Vision: Aerial photographs are often captured in pairs or sets to
enable stereoscopic viewing. Stereoscopic vision allows for depth perception and
the creation of three-dimensional models by observing the displacement of
features in the overlapping photographs.
There are different types of aerial photographs used in various applications.
The main types include:
Vertical Aerial Photographs: These are captured with the camera axis pointed
straight down, providing a vertical perspective. Vertical photographs are
commonly used for mapping, land surveying, and photogrammetric analysis.
They provide accurate measurements and allow for the creation of
orthophotos (geometrically corrected aerial images) and digital elevation
models (DEMs).
Oblique Aerial Photographs: In oblique aerial photography, the camera is tilted
at an angle, capturing images with a non-vertical perspective. Oblique
photographs are often used for visual interpretation, architectural
photography, and landscape analysis. They provide a more artistic or dramatic
representation of the subject but are less suitable for precise measurement or
mapping.
Near-vertical Aerial Photographs: Near-vertical photographs are
captured with a slight tilt, providing a perspective that is between
vertical and oblique. These photographs offer a balance between the
accuracy of vertical photographs and the visual context provided by
oblique photographs. They are useful for applications such as urban
planning, site analysis, and environmental studies.
Thermal Infrared Aerial Photographs: Thermal infrared aerial photography
involves capturing images in the thermal infrared spectrum. These
photographs can reveal variations in temperature, which can be useful in
applications such as environmental monitoring, agriculture, and heat loss
analysis of buildings.
Multispectral Aerial Photographs: Multispectral aerial photographs are
captured using cameras equipped with filters sensitive to different parts of the
electromagnetic spectrum. These photographs provide information beyond
the visible range, enabling analysis of vegetation health, soil composition,
and other environmental factors.
Each type of aerial photograph has its own advantages and applications,
and the selection depends on the specific needs of the project or study.
TERRESTRIAL PHOTOGRAPHS
AERIAL PHOTOGRAPHS
• Aerial photographs can be classified as oblique or
vertical, based on the orientation of the camera relative to the
ground at the time of acquisition.
Vertical Photographs
• These photographs are taken from the air with the axis of
the camera vertical or nearly vertical.
• A truly vertical Photograph closely resembles a map.
• These are utilized for the compilation of topographic and
engineering surveys on various scales.
Oblique Photographs
Oblique vs Vertical Photographs
[Images: an oblique photo beside a vertical photo]
Principle and types of stereoscopy
The principle of stereoscopy is based on the concept of binocular vision,
which is the ability of humans and some animals to perceive depth and
three-dimensional information by processing the slightly different views
seen by each eye. Stereoscopy mimics this natural visual perception by
presenting two slightly offset images to each eye, creating the illusion of
depth and a sense of three-dimensionality.
The key principle of stereoscopy involves the following:
Stereoscopic Pair: A stereoscopic pair consists of two images, called left and
right views, which are captured or created from slightly different
perspectives. These images represent what each eye would see if viewing
the scene separately.
Parallax: Parallax is the apparent shift or displacement of objects when
viewed from different positions. In stereoscopy, the offset between the
left and right views creates parallax, and this parallax is used to generate
the perception of depth.
Binocular Fusion: Binocular fusion is the brain's ability to merge the two
slightly offset images into a single perception of a three-dimensional
scene. By aligning and combining corresponding points in the left and
right views, the brain reconstructs the depth and spatial relationships of
the objects in the scene.
There are different types of stereoscopy techniques used to create the
illusion of depth and provide a three-dimensional viewing experience. Some
common types include:
Anaglyph Stereoscopy: Anaglyph stereoscopy involves using specially filtered
glasses with one red lens and one cyan, green, or blue lens. The left and right
views are presented as differently colored images (usually red and cyan).
When viewed through the corresponding color filters, the brain merges the
images and perceives depth.
Polarized Stereoscopy: Polarized stereoscopy uses glasses with different
polarizing filters for each eye (usually circular or linear polarization). The left
and right views are projected or displayed with matching polarizations. The
polarized glasses filter out the respective views, allowing each eye to see the
intended image, resulting in a stereoscopic effect.
Shutter Glasses Stereoscopy: Shutter glasses, also known as active shutter
glasses, are synchronized with a display device that alternates between showing
the left and right views rapidly. The glasses contain liquid crystal lenses that
open and close in sync with the display, allowing each eye to see the correct
image. This technique provides high-quality stereoscopic images but requires
compatible hardware.
Autostereoscopy: Autostereoscopy refers to the ability to view three-
dimensional images without the need for specialized glasses or additional
devices. Various autostereoscopic methods exist, including lenticular lenses,
parallax barriers, and holography. These techniques use optical elements to
direct different views to each eye, enabling the perception of depth without the
need for additional aids.
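As a concrete illustration of the anaglyph technique described above, the short sketch below builds a red-cyan anaglyph from a stereo pair with NumPy and Pillow. The file names are placeholders, and the two views are assumed to be aligned and equal in size.

import numpy as np
from PIL import Image

# Placeholder file names; left/right views must be registered and same size
left = np.asarray(Image.open("left.png").convert("RGB"))
right = np.asarray(Image.open("right.png").convert("RGB"))

anaglyph = np.zeros_like(left)
anaglyph[..., 0] = left[..., 0]     # red channel comes from the left view
anaglyph[..., 1:] = right[..., 1:]  # green and blue (cyan) from the right view
Image.fromarray(anaglyph).save("anaglyph.png")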
Stereoscopy has applications in various fields, including entertainment
(3D movies, virtual reality), scientific visualization, medical imaging,
and product design. It provides an immersive and realistic viewing
experience by adding depth and enhancing the perception of the three-
dimensional world.
What are 3D Glasses?
These glasses utilize special red
/ cyan lenses to interpret the
image. These lenses produce
the images you see by color
filtering the layered image that
you're actually looking at.
While one lens filters out all the
red in an image, the other lens
filters out the cyan, causing
your brain to see the picture in
3D.
Map Vs Mosaic
Map and mosaic are both terms used in the context of geospatial data and
imagery, but they refer to different concepts:
Map: A map is a visual representation or depiction of geographic information,
typically on a flat surface. It can be a two-dimensional representation of the
Earth's surface or a specific area, showing various features such as landforms,
roads, rivers, political boundaries, and other relevant information. Maps are
created using cartographic techniques and can be used for navigation, analysis,
planning, and communication of spatial information.
Maps can be created through various means, including manual drawing,
computer-aided design (CAD), or digital mapping software. They often involve
the aggregation and symbolization of different data layers to convey specific
information and facilitate understanding of spatial relationships.
Mosaic: A mosaic, in the context of geospatial imagery, refers to the
process of combining multiple individual images or photographs to
create a seamless and continuous image of a larger area. It involves
aligning and blending the individual images to create a composite
representation of the entire scene.
Mosaicking is commonly used in remote sensing and aerial
photography applications, where multiple overlapping images are
captured to cover large areas. By stitching these images together, a
single mosaic image is created, which provides a comprehensive view
of the entire region. Mosaics can be created with various software
tools and algorithms that handle image registration, color correction,
and blending.
Mosaics are useful for visual interpretation, analysis, and mapping
purposes. They allow for a comprehensive view of large areas, enabling the
identification and understanding of spatial patterns, land cover changes,
and other geospatial features. Mosaics can also serve as base maps for
further analysis or overlaying additional data layers.
In summary, while a map represents spatial information in a two-
dimensional format, a mosaic is a composite image created by stitching
together multiple individual images to cover a larger area. Maps provide a
general representation of geographic features and their relationships,
while mosaics offer a detailed visual depiction of the actual surface as
captured by imagery.
Ground control
Ground control, in the context of geospatial data and mapping, refers to
the use of known reference points or features on the Earth's surface to
establish the accuracy and spatial positioning of remotely sensed
imagery, aerial photographs, or other geospatial data. Ground control
points (GCPs) are typically physical features with known coordinates that
are visible in both the imagery and on the ground. These GCPs are used to
align, georeference, and rectify the imagery to a known coordinate
system.
Ground control
The process of ground control involves the following steps:
Selection of Ground Control Points: Several GCPs are selected on the ground,
and their coordinates are measured using surveying techniques such as
Global Navigation Satellite Systems (GNSS) or Total Station. GCPs should be
easily identifiable and have stable positions over time.
Identification in Imagery: The selected GCPs are identified in the imagery or
remotely sensed data. This can be done manually by visually matching the
GCPs in the imagery or through automated algorithms that detect and match
features between the ground and the imagery.
Georeferencing: The coordinates of the GCPs on the ground are associated
with their corresponding locations in the imagery. This process involves
determining the transformation parameters to align the imagery with the
desired coordinate system. Common transformation methods include affine,
polynomial, or projective transformations.
Rectification: Once the imagery is georeferenced, it can be rectified to
remove any distortions caused by terrain relief or sensor characteristics.
Rectification ensures that the image has a uniform scale and represents the
Earth's surface accurately. It involves resampling the imagery to a regular grid
and correcting for any geometric distortions.
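As a small illustration of the georeferencing step, the sketch below fits an affine transformation to a handful of GCPs by least squares. All pixel and ground coordinate values are made up for the example.

import numpy as np

# (col, row) pixel positions of GCPs identified in the image (illustrative)
pix = np.array([[120, 340], [980, 310], [150, 900], [1010, 870]], float)
# Corresponding ground coordinates (easting, northing) from a GNSS survey
gnd = np.array([[500120.0, 180900.0], [500980.0, 180930.0],
                [500150.0, 180340.0], [501010.0, 180370.0]])

# Design matrix for the affine model X = a*col + b*row + c (same form for Y)
A = np.column_stack([pix, np.ones(len(pix))])
coef_x, *_ = np.linalg.lstsq(A, gnd[:, 0], rcond=None)
coef_y, *_ = np.linalg.lstsq(A, gnd[:, 1], rcond=None)

def to_ground(col, row):
    """Map a pixel position to ground coordinates with the fitted affine."""
    return (coef_x @ [col, row, 1.0], coef_y @ [col, row, 1.0])

print(to_ground(500, 500))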
Ground control points are essential in various applications, including:
Orthorectification: GCPs are used to rectify aerial photographs or satellite
imagery to produce orthorectified images, which have a uniform scale and can
be used for accurate measurement, mapping, and analysis.
Geospatial Data Integration: GCPs enable the alignment and integration of
different geospatial data sources, such as aerial imagery, LiDAR [Light
Detection and Ranging] data, and satellite imagery, to create a consistent and
accurate geospatial dataset.
Change Detection: GCPs provide a reference framework to detect and quantify
changes over time by comparing imagery captured at different periods.
Ground control points are essential in various applications, including:
Accuracy Assessment: GCPs serve as ground truth points for evaluating the
accuracy of geospatial products, such as orthophotos, maps, or digital
elevation models (DEMs).
Overall, ground control plays a critical role in establishing the accuracy,
georeferencing, and rectification of geospatial data, ensuring that the data
can be used reliably for mapping, analysis, and decision-making.
Parallax measurements for height and determinations
Parallax measurements can be used to determine the height or distance of an object
or feature by exploiting the principle of triangulation. The basic concept involves
observing an object from two different positions and measuring the angular
displacement or parallax between the two views. By knowing the baseline distance
between the two viewpoints and the measured parallax angle, it is possible to
calculate the height or distance to the object using trigonometry.
Here's a general overview of how parallax measurements can be used for height and
distance determinations:
Baseline Setup: Establish two observation points or viewpoints from which the object
or feature of interest can be observed. The distance between these viewpoints is
referred to as the baseline distance. It is crucial to accurately measure or know the
baseline distance for accurate results.
Observations: From each viewpoint, observe the object or feature, making sure it
is visible and identifiable in both views. Note the angular displacement or parallax
between the two views. Parallax is the apparent shift of the object's position when
viewed from different positions.
Parallax Angle Measurement: Measure the parallax angle, which is the angular
displacement of the object as observed from the two viewpoints. This can be done
using instruments such as theodolites, total stations, or specialized
photogrammetric equipment.
Trigonometric Calculation: With the baseline distance and the measured parallax
angle, trigonometric calculations can be used to determine the height or distance
to the object. The specific calculations depend on the geometry of the setup and
the type of parallax being measured.
Determining Object Height: If the object is on or near the Earth's surface, the height can be
calculated using simple trigonometry. The height (h) of the object is given by the formula: h =
b * tan(α), where b is the baseline distance and α is the parallax angle.
Determining Object Distance: If the object is at a significant distance and its height is known,
the distance (d) to the object can be calculated using trigonometry. The distance is given by
the formula: d = h / tan(α), where h is the object height and α is the parallax angle.
It's important to note that accurate measurements of the baseline distance and the parallax
angle are crucial for obtaining reliable height or distance determinations. Additionally,
factors such as atmospheric conditions, instrument accuracy, and object visibility should be
considered to ensure accurate results.
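A minimal numerical sketch of the two formulas above, with an illustrative baseline and parallax angle:

import math

b = 100.0                       # baseline distance between viewpoints, m
alpha = math.radians(2.5)       # measured parallax angle

h = b * math.tan(alpha)         # object height for a ground-level target
print(f"height   = {h:.2f} m")  # ~4.37 m

h_known = 50.0                  # if the object height is known instead ...
d = h_known / math.tan(alpha)   # ... solve for the distance to the object
print(f"distance = {d:.1f} m")  # ~1145.2 m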
Parallax measurements can be applied in various fields, including surveying,
photogrammetry, astronomy, and remote sensing. They provide a method for estimating
heights or distances without directly accessing the object, making them useful for
inaccessible or remote locations.
SCALE OF VERTICAL PHOTOGRAPH
• The scale of a vertical photograph is the ratio of a distance
on the photo to the corresponding distance on the ground.
• Scale varies for points of different elevation.
• Scale remains constant only for those points having the same elevation.
• If all points are at the same elevation, the scale is given by
S = f / (H − h), where
H = flying height of the camera above mean sea level
f = focal length of the camera
h = height of the ground above mean sea level
• Case 2: If A and B are two points having
elevations ha and hb respectively
above mean sea level:
• Scale of photograph at elevation ha: Sa = f / (H − ha)
• Scale of photograph at elevation hb: Sb = f / (H − hb)
• Scale of photograph at any elevation h: Sh = f / (H − h)
• This can also be represented as a Representative Fraction
(Rh): Rh = 1 : (H − h) / f
SCALE OF VERTICAL PHOTOGRAPH
• Datum Scale – if all points are projected at Mean Sea Level: Sd = f / H
• Average Scale – if all points are projected on a plane representing the
average elevation hav: Sav = f / (H − hav)
Computation of the length of a line between points of different elevation
Coordinates of points A and B on the ground in plan: A → (Xa, Ya), B → (Xb, Yb)
Corresponding points on the photograph: A → (xa, ya), B → (xb, yb)
From similar triangles, for point A:
Xa = xa (H − ha) / f,  Ya = ya (H − ha) / f
and likewise for point B:
Xb = xb (H − hb) / f,  Yb = yb (H − hb) / f
So the X and Y of any point follow from its photo coordinates and elevation,
and the length of the line between two points A and B is given by
L = √[(Xa − Xb)² + (Ya − Yb)²]
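The sketch below works the scale and line-length formulas above with illustrative numbers; the focal length, flying height, elevations and photo coordinates are all assumed for the example.

import math

f = 0.20                          # focal length, m (20 cm)
H = 2000.0                        # flying height above msl, m
ha, hb = 250.0, 150.0             # elevations of A and B, m
xa, ya = 0.0450, 0.0320           # photo coordinates of A, m
xb, yb = -0.0380, 0.0125          # photo coordinates of B, m

Sa = f / (H - ha)                 # scale at elevation ha -> 1 : 8750
Sb = f / (H - hb)                 # scale at elevation hb -> 1 : 9250

# Ground coordinates from similar triangles: X = x(H - h)/f, Y = y(H - h)/f
Xa, Ya = xa * (H - ha) / f, ya * (H - ha) / f
Xb, Yb = xb * (H - hb) / f, yb * (H - hb) / f
L = math.hypot(Xa - Xb, Ya - Yb)  # length of line AB on the ground
print(f"scale at A = 1:{1/Sa:.0f}, scale at B = 1:{1/Sb:.0f}, AB = {L:.1f} m")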
Relief Displacement
• When the ground is not horizontal then the scale of the photograph varies from
point to point. Scale of an aerial photograph is partly a function of flying height.
Therefore variations in elevation cause variations in scale of aerial photographs
and every point on the photograph is displaced from its true position.
• The effect of relief does not only cause a change in the scale but can also be
considered as a component of image displacement. Higher the elevation of an
object, the farther the object will be displaced from its actual position away from
the principal point of the photograph (the point on the ground surface that is
directly below the camera lens). Lower the elevation of an object, the more it will
be displaced towards the principal point. This effect, called relief displacement.
Due to different elevation of different points every point on photograph is
displaced from their original position. This displacement is called relief
displacement.
• Consider a point T on top of a building and a point B at its bottom. On a
map, both points have identical X, Y coordinates; however, on the
photograph they are imaged at different positions, namely at T′ and B′. The
distance d between the two photo points is called relief
displacement because it is caused by the elevation difference Δh between T
and B.
Relief Displacement
Relief Displacement
Relief Displacement d = r h / H
where r is the radial distance of the image point from the principal point,
h the height of the object above the datum, and H the flying height above
the datum.
If the relief displacement is known, then the height of the object is
h = d H / r
Problem –
• The distance from the principal point to an image on a photograph is 6.44cm
and the elevation of the object above the datum is 250m. What is the relief
displacement of the point if the datum scale is 1/10000 and the focal length
of the camera is 20cm
– Given: datum scale Sd = 1/10000; r = 6.44 cm; h = 250 m; f = 20 cm
– The datum scale is Sd = f / H, so H = f × 10000
 H = (20 / 100) × 10000
 H = 2000 m above msl
– Relief displacement: d = r h / H
 d = (6.44 × 250) / 2000
 d = 0.805 cm
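The same computation can be checked in a few lines; the values are those of the worked problem above.

# Check of the worked problem: d = r*h/H, with H recovered from the
# datum scale 1:10000 and the 20 cm focal length
f_cm = 20.0
scale_number = 10000.0
H = f_cm / 100.0 * scale_number     # flying height above datum: 2000 m

r_cm, h = 6.44, 250.0
d = r_cm * h / H                    # relief displacement on the photo, cm
print(f"H = {H:.0f} m, d = {d:.3f} cm")   # H = 2000 m, d = 0.805 cm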
Photo scale
• We use the representative fraction for scale expressions, in the form of a
ratio, e.g. 1 : 5,000.
• The scale of a near-vertical photograph can be approximated by the equation
s = 1 / mb = c / H
• where mb is the photograph scale number,
• c the calibrated focal length, and
• H the flight height above mean ground elevation.
• Note that the flight height H refers to the average ground elevation. If it
is with respect to the datum, then it is called the flight altitude HA, with
HA = H + h.
• The photograph scale varies from point to point.
• For example, the scale for point P can be determined as the ratio of the
image distance CP′ to the object distance CP:
sP = CP′ / CP = √(xP² + yP² + c²) /
     √[(XP − XC)² + (YP − YC)² + (ZP − ZC)²]
• where
• xP, yP are the photo-coordinates,
• XP, YP, ZP the ground coordinates of point P, and
• XC, YC, ZC the coordinates of the projection center C in the ground
coordinate system.
• The above equation takes into account any tilt and topographic variations
of the surface (relief).
Why Flight Planning??
• To ensure coverage of an area to be mapped at the required scale without any
gaps in between.
• The product of a flight plan is basically a map
• A flight plan will determine the spacing between successive photographs,
locations of flight lines and start and end locations of each flight path.
Flight Planning
• Success of any photogrammetric project depends mainly on
acquisition of good quality pictures.
• Due to weather and terrain conditions, time frames for photography are
limited.
– Ideally cloud cover < 10 % is acceptable.
– Clouds higher than the flying height might cast large shadows
on the ground.
– Windy days may cause excessive image motion and difficulties in camera and
aircraft orientation.
– Areas near industries may be susceptible to atmospheric haze and smog.
• Re-flights can be very expensive and cause delays in the project.
• Therefore the flight path must be carefully planned and executed accordingly.
Flight Planning
Overlapping Imagery
• Aerial photographs taken are rarely a single shot event. Multiple photographs
are taken along a flight path to ensure complete photo coverage of an area.
• To achieve this we must have overlap between photo images. The
overlapping of photos end to end is termed end lap.
• An end lap of 30% will avoid potential missing areas of coverage due to the
effects of turbulence.
• For imagery collected for use in stereo viewing, an end lap of 60% is
considered ideal.
• For block coverage of an area at a specific photo scale, it is often necessary
to fly parallel strips of aerial photography.
• The adjacent strips also overlap each other. This overlapping area is termed
side lap and is generally specified at 30% to ensure good coverage.
Flight Planning
Overlap in Photographs
Overlap in Photographs
Side lap between adjacent strips
Effect of topography on scale and overlap
No of photographs required
Let A = total area to be photographed
l = length of photograph in the direction of flight
w = width of the photograph normal to the direction of flight
s = scale of the photograph
L = net ground distance corresponding to l
W = net ground distance corresponding to w
a = net ground area covered by each photograph = L × W
Pl = overlap between successive photographs in the direction of flight
(as a fraction); Pw = side lap
Since each photograph has a longitudinal overlap of Pl, the actual ground
length (L)
covered by each photograph is given by L = (1-Pl) sl
Similarly , the actual ground width (W) covered by each photograph is given by
W = (1-Pw) sw
Hence ground area covered by each photograph
a= L * W = (1-Pl) sl * (1-Pw) sw
Number of photographs required (N) = A/a
• The scale of an aerial photograph is 1cm = 100m. The photograph size is
20cm * 20cm.Determine the number of photographs required to cover an
area of 100sq.km, if the longitudinal lap is 60% and side lap is 30%.
• GIVEN DATA: l = 20 cm, w = 20 cm, Pl = 0.60, Pw = 0.30, s = 100
• The actual ground length covered by each photograph is
L = (1-Pl)sl = (1-0.60)*100*20 = 800m or 0.8km
• The actual ground width covered by each photograph is
W = (1-Pw)sw = (1-0.30)*100*20 = 1400m or 1.4km
• Net area covered by each photograph is
a= L * W
a = 0.8 * 1.4
a =1.12 Sq.km
• Hence the number of photographs required is N = A/a = 100/1.12 = 89.3 ≈ 90
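The same calculation, parameterized so that other photo sizes, scales and overlaps can be tried (rounding up, since a fraction of a photograph still requires a full exposure):

import math

def photos_required(area_km2, l_cm, w_cm, scale_m_per_cm, p_long, p_side):
    """Number of photographs to cover an area with the given overlaps."""
    L = (1 - p_long) * scale_m_per_cm * l_cm / 1000.0   # net ground length, km
    W = (1 - p_side) * scale_m_per_cm * w_cm / 1000.0   # net ground width, km
    return math.ceil(area_km2 / (L * W))

print(photos_required(100, 20, 20, 100, 0.60, 0.30))    # -> 90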
BASIC TERMS USED IN PHOTOGRAMMETRY
• Tilted Photograph: An aerial photograph taken with a camera having its
optical axis tilted, usually less than 3° from the vertical, is known as a
tilted photograph.
• Exposure (or air) station: The exact position of the front nodal point of the
lens in the air at the instant of exposure.
• Flying height (H): The elevation of the air station above mean sea level is
known as the flying height of the aircraft.
• Nadir Point (Plumb Point): The point where a plumb line dropped from the
front nodal point strikes the photograph.
BASIC TERMS USED IN PHOTOGRAMMETRY
• Camera Axis: It is the line passing through the centre of the camera lens
perpendicular both to the camera plate (negative) and the picture plane
(photography).
• Fiducial mark: A fiducial mark is one of two, three or four marks, located in
contact with the photographic emulsion in a camera image plane to
provide a reference line or lines for the plate measurement of images.
• Principal Point: The point where a perpendicular dropped from the front
nodal point strikes the photograph is known as the principal point of the
photograph.
• Focal length: It is the perpendicular distance from the centre of the camera
lens to either the picture plane or the camera plate.
Basic Concepts of
Remote Sensing
REMOTE SENSING
 The study of something without making actual contact with the object
 Making measurements of the Physical properties of an object from
a remote distance
 Satellite technology is an example of remote sensing
 Satellites measure properties of the Earth and transmit the data
to receiving stations
REMOTE SENSING includes all methods and techniques used to gain
qualitative and quantitative information about distant objects without
coming into direct contact with these objects.
 Look-Look, NO Touch
Remote Sensing (RS) methods try to answer four basic
questions:
HOW MUCH of WHAT is WHERE?
What is the SHAPE and EXTENT of ... ?
(Area, Boundaries, Lineaments, ...)
Has it CHANGED?
What is the MIX of Objects?
HOW MUCH of WHAT is WHERE?
WHAT: Type, Characteristic and Properties of an Object.
For example: Water, Vegetation, Land; Temperature, Concentration, State
of Development; Subtype, Species, Use of ... Includes determination of
generic object type, character and property as well as its abstract meaning.
DATA INTERPRETATION
HOW MUCH: determine by simple COUNTING, measuring
AREA covered or percentage of total area coverage.
WHERE: Relate locations and area covered to either a standard map or
to the actual location on the ‘ground’ where the object occurs.
NOTE: WHERE also refers to a moment in time
What is the SHAPE and EXTENT of ... ? (Area, Boundaries, Lineaments, ...)
This extends the ‘WHERE’ to be a completely GEOMETRIC problem. MAP
PRODUCTION methods are to be applied to the analysis of RS information.
These include:
Photogrammetric Methods:
Identification and Delineation of Boundaries and Lineaments
(Roads, Rivers, Fault Lines)
Has it CHANGED?
CHANGE may occur with progress of TIME.
Change may be detected through comparison of observed states at
different moments in time.
=> CHANGE DETECTION
What is the MIX of Objects?
The surface of the Earth is covered by objects like Soil, Water, Grass, Trees,
Houses, Roads and so on. These are ‘GENERIC OBJECTS’.
We know these well, but we also know objects like Open Forest,
Residential and Industrial Estates, etc.
Each of these ABSTRACT OBJECTS is made up of a typical
collection of Generic Objects
"Remote sensing is the science (and to some extent, art) of acquiring
information about the Earth's surface without actually being in contact
with it. This is done by sensing and recording reflected or emitted
energy and processing, analyzing, and applying that information."
[Diagram: data acquisition. A source (force field) illuminates a generic
object; the reflected energy is captured by a sensor system (e.g. a camera),
producing a resulting RS data set (e.g. an image).]
The Process of Remote Sensing
A. There are interactions with the
atmosphere
B. The energy reaches the target, or
object on Earth being studied and
interacts with the target based on the
target’s properties.
C. Energy scattered by or emitted from
the target is then collected by the
sensor
D. The sun, or the satellite itself, is the
energy source that provides
electromagnetic energy
The Process of Remote Sensing
E. The sensor transmits the electronic
information to a receiving and
processing station. Here, it is
processed into an image
F. The processed image is then
interpreted to learn about the target
G. The information is applied so that we
better understand the target, learn
something new about the target, or
solve a particular problem
ELECTROMAGNETIC SPECTRUM
Electromagnetic energy travels in waves and spans a broad spectrum
from very long radio waves to very short gamma rays. The human eye
can detect only a small portion of this spectrum, called visible
light. A radio detects a different portion of the spectrum, and an x-ray
machine uses yet another portion.
ELECTROMAGNETIC SPECTRUM
Radiation energy that is emitted in wave form by all substances is the basis for
all remote sensing of the earth
Electromagnetic Radiation
Electromagnetic radiation consists of an electrical field, E, and a magnetic
field, M. Both of these fields travel at the speed of light, c. Different
kinds of electromagnetic radiation can be distinguished by wavelength and
frequency.
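Since c = λν ties wavelength and frequency together, a short sketch with representative wavelengths (chosen here for illustration) makes the span of the spectrum concrete:

c = 3.0e8                                   # speed of light, m/s
for name, lam in [("blue light", 0.45e-6),
                  ("thermal IR", 10e-6),
                  ("C-band radar", 0.056)]:
    # frequency follows directly from nu = c / lambda
    print(f"{name}: lambda = {lam} m, nu = {c / lam:.3e} Hz")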
Energy Source and its Characteristics
All objects whose temperature is greater than absolute zero
(0 K, i.e. −273 °C) emit radiation. All stars and planets emit radiation. Our
chief star, the sun, is an almost spherical body with a diameter of
1.39 x 10^6 km at a mean distance from the earth equal to 1.5 x
10^8 km. The continuous conversion of hydrogen to helium, the
main constituent of the sun, generates the energy that is
radiated from the outer layers. If the energy received at the edge
of the earth's atmosphere were distributed evenly over the earth, it
would give an average incident flux density of 1367 W/m². This is
known as the solar constant.
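The ~1367 W/m² figure can be roughly reproduced by treating the sun as a blackbody and spreading its total output over a sphere whose radius is the earth-sun distance; the temperature and radius used below are standard rounded values, not taken from the slides.

import math

sigma = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
T_sun = 5778.0           # effective solar surface temperature, K (assumed)
R_sun = 6.96e8           # solar radius, m (diameter ~1.39e6 km)
d = 1.5e11               # mean earth-sun distance, m

L_sun = 4 * math.pi * R_sun**2 * sigma * T_sun**4   # total solar output, W
flux = L_sun / (4 * math.pi * d**2)                 # flux at earth, W/m^2
print(f"{flux:.0f} W/m^2")                          # ~1360 W/m^2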
Visible Light ⚫ Visible light is light that our eyes can see
⚫ Visible light makes up an extremely small
part of the electromagnetic spectrum
⚫ Range from about 0.4 to 0.7µm
⚫ Blue, red and green are the primary colors
of light. All other colors can be made by
combining them in various proportions.
⚫ Each color has a different wavelength. Red
has the longest wavelength and violet has
the shortest wavelength. When all the waves
are seen together, they make white light.
INTERACTION OF EM RADIATION WITH ATMOSPHERE
EM radiation must pass through the atmosphere in order to reach the
earth's surface, and again to reach the sensor after reflection and emission
from earth surface features.
Water vapour, oxygen, ozone, CO2, aerosols, etc., present in the atmosphere
influence electromagnetic radiation through the mechanisms of
SCATTERING and ABSORPTION.
INTERACTION OF EM RADIATION WITH ATMOSPHERE
SCATTERING
Unpredictable diffusion of radiation by molecules of gas, dust
and smoke. Scattering reduces contrast and changes the
spectral signatures of ground objects.
Rayleigh Scattering: occurs when the diameter of the gas molecules or
particles is much less than the wavelength of radiation. The shorter the
wavelength, the greater the scattering. (The sky appears blue during midday
and red during sunrise and sunset.)
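Rayleigh scattering follows the well-known λ⁻⁴ dependence (not stated explicitly in the slide), which quantifies why shorter wavelengths scatter so much more strongly:

blue, red = 0.45, 0.65          # wavelengths in micrometres
ratio = (red / blue) ** 4       # Rayleigh scattering goes as 1/lambda^4
print(f"blue scatters ~{ratio:.1f}x more than red")   # ~4.4x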
Mie Scattering: occurs where the diameter of water vapour or dust
particles approximately equals the wavelength of radiation.
Non-selective Scattering: occurs when the diameter of the particles
is several times the radiation wavelength (approximately ten times).
Pollen grains, cloud droplets, ice and snow crystals and rain drops are the
main sources of non-selective scattering. These scatter all the wavelengths
of visible light with equal efficiency (this is the reason clouds appear
white).
ABSORPTION --- Atmospheric absorption results in the effective loss of
energy to atmospheric constituents. Oxygen absorbs in the
ultraviolet region; CO2 also prevents energy from
reaching the earth; water vapour is an extremely important absorber within
the infrared part of the spectrum. The wavelengths at which EM radiation is
partially or wholly transmitted through the atmosphere are known as
ATMOSPHERIC WINDOWS and are used to acquire remote sensing data.
REMOTE SENSING OBSERVATION PLATFORMS
(1) Airborne platforms and (2) space-based platforms
Airborne Platforms (earlier days)
Three types of aircraft are currently used for remote sensing
operations: the Dakota, the AVRO and the Beechcraft Super King Air 200.
Expensive, and cannot provide cost- and time-effective solutions.
What Are Satellites?
⚫Satellites are smaller objects traveling around larger objects
⚫Satellites may be man-made or natural, like the moon
⚫The two main types of satellites are polar-orbiting and
geostationary
⚫Satellites are designed for three general purposes: science,
applications, or communications
Space based platforms
Artificial Satellites
Artificial Satellites are human-made space craft that are built and sent into
space by people. These spacecraft can be crewed, such as the Space Shuttle, or
uncrewed, such as NASA’s Hubble Space Telescope
[Images: the Hubble Space Telescope, a communications satellite, and the NPOESS satellite]
Polar-Orbiting Satellites
Polar orbiting satellites travel in a circular pattern over the North and the South
Poles, so they can look at large portions of the Earth as it turns below them.
Polar-orbiting satellites are placed into a low-Earth orbit.
They orbit at about 800 kilometers (500 miles) above the Earth.
They travel at about 17,000 miles per hour (roughly 27,000 km/h)
[Figure: NPOESS local equatorial crossing times]
Geostationary Satellites
Geostationary satellites orbit the Earth at about 36,000 km above the
equator.
Seen from Earth, the satellite appears to be floating over a certain
spot on the equator.
They are primarily used for weather and communication
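The ~36,000 km altitude follows from Kepler's third law for a circular orbit whose period equals one sidereal day; a quick check with rounded constants:

import math

GM = 3.986e14            # earth's gravitational parameter, m^3/s^2
T = 86164.0              # sidereal day, s
R_earth = 6.378e6        # equatorial radius, m

r = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)       # orbit radius from Kepler
print(f"altitude = {(r - R_earth) / 1000:.0f} km")  # ~35,786 km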
Scientific Satellites
⚫ Most well-known type of satellite
⚫ Information from these satellites clarifies the Earth's history, present
condition, and what the future may hold
⚫ Other scientific satellites look away from the Earth, studying the sun,
stars, planets and other aspects of the universe
Application/Weather Satellites
⚫ Application satellites are used to test and develop
ways to improve global weather forecasting
⚫ These satellites are vital in predicting where and
when tropical storms, hurricanes, floods,
cyclones, tidal waves and forest fires may strike
⚫ The Television Infrared Observation Satellite
(TIROS), launched in 1960, was the first of a
series of meteorological satellites to carry
television cameras to photograph the Earth’s
cloud cover for research and forecasting
Application/Weather Satellites
⚫ Later satellites, like the series of Nimbus satellites
first launched in 1964, had infrared cameras as
well. These satellites improved upon storm and
hurricane forecasting and played a major role in
the study of ozone depletion
Communications Satellites
⚫First commercial satellites
⚫Aluminum-coated balloons were the first
communications satellites
⚫The first commercially-launched satellite was
Telstar 1, launched by AT&T in 1962. It
transmitted photos and phone calls between
America and Europe. This satellite was capable
of carrying 600 telephone channels or one
television channel
⚫Today, satellites like Intelsat provide up to
120,000 simultaneous two-way telephone circuits
REMOTE SENSING SATELLITES
Landsat Satellite Programme
The National Aeronautics and Space Administration (NASA) of the USA planned
the launch of a series of Earth Resources Technology Satellites (ERTS), and
consequently ERTS-1 was launched in July 1972.
ERTS-1 was later renamed LANDSAT-1, and five Landsat satellites have been
launched so far. Extensively used for agriculture, civil engineering,
forestry, geography, geology, land use planning and oceanography.
SPOT Satellite Programme
France, Sweden and Belgium jointly developed the Système Pour
l'Observation de la Terre (SPOT).
The first satellite was launched in February 1986. Extensively used for
urban planning, urban growth assessment, transportation planning, etc.
Indian Remote Sensing Satellites(IRS)
The Satellite for Earth Observation (SEO-1), now called Bhaskara-1, was the
first Indian remote sensing satellite, launched in 1979.
IRS-1A, IRS-1B, IRS-1C, IRS-1D, IRS-P4, RESOURCESAT, CARTOSAT, etc.
are a few of the satellites launched by India.
SENSORS
Sensors are electronic instruments that receive EM radiation and
generate an electric signal that corresponds to the energy variations of
different earth surface features. The signal can be recorded and
displayed as numerical data or an image. A scanning system employs
detectors with a narrow field of view which sweep across the terrain
to produce an image.
Sensors on board of IRS
1) Linear Imaging and Self Scanning sensor (LISS-I & LISS-II): was
onboard the IRS-1A and 1B satellites. It had 4 bands operating in the
visible and near-infrared region.
2) Linear Imaging and Self Scanning sensor (LISS-III): was onboard the
IRS-1C and 1D satellites. It has 3 bands in the visible and near-infrared
region and 1 band in the short-wave infrared region.
3) Panchromatic Sensor (PAN): was onboard the IRS-1C and 1D satellites. It
has one band.
4) Wide Field Sensor (WiFS): was onboard the IRS-1C and 1D satellites. It
has two bands operating in the visible and near-infrared region.
5) Modular Opto-Electronic Scanner (MOS): was onboard the IRS-P3
satellite.
6) Ocean Colour Monitor (OCM): was onboard the IRS-P4 satellite. It has 8
spectral bands operating in the visible and near-IR region.
7) Multi-frequency Scanning Microwave Radiometer (MSMR): was onboard the
IRS-P4 satellite. It is a passive microwave sensor.
Types of remote sensing
Passive: the source of energy is either the Sun or the
Earth/atmosphere
◦ Sun: wavelengths 0.4–5 µm
◦ Earth or its atmosphere: wavelengths 3 µm – 30 cm
Active: the source of energy is part of the remote sensor system
◦ Radar: wavelengths mm – m
◦ Lidar: wavelengths UV, visible, and near infrared
Spatial Resolution
Spatial resolution and coverage
 Instantaneous field-of-view (IFOV)
 Pixel: smallest unit of an image
 Pixel size
Spatial coverage
 Field of view (FOV), or
 Area of coverage; for example, MODIS: a 2300 km swath giving global
coverage; weather radar (NEXRAD): a circle with a 230 km radius
[Images: the same scene at 30 m and at 1 m spatial resolution]
Applications of Remote Sensing
1) Agriculture - Monitoring crop condition, identification of crops and their
coverage estimation, detection of moisture stress in crops, desertification
2) Forestry - Improved forest type mapping, monitoring large-scale
deforestation and forest fires, monitoring urban forestry, wildlife habitat
assessment
3) Land use and soils - Mapping land use / land cover, change detection,
soil categorization
4) Urban land use - Transport studies, monitoring urban sprawl,
identification of unauthorized structures, navigation purposes
5) Water resources - Monitoring of surface water bodies, glacier inventory
6) Watershed - Delineation of watershed boundaries, silting of water
harvesting structures, major river valley projects
7) Disasters - Mapping flood-inundated areas, damage assessment
8) Digital elevation models - Contour maps, slope maps
[Images: the Nadergul area in 2003 and in 2019]
Evolution of satellite Remote Sensing in India
• Following the successful demonstration flights of Bhaskara-1 and
Bhaskara-2 - experimental Earth observation satellites developed
and built by ISRO (Indian Space Research Organisation) and
launched in 1979 and 1981, respectively - India began the
development of an indigenous IRS (Indian
Remote Sensing Satellite) program.
• India realized quite early that sustaining its space program in the
long run would depend on indigenous technological capabilities (in
particular, US export restrictions made this clear).
• India under its different earth observation missions and
programmes has launched varieties of satellites which have been
proved to be an indispensable tool for natural resource mapping,
monitoring ,management and planning including environmental
assessment at global, regional and local levels.
• The success of the missions and developmental programmes has
been based on a judicious scientific approach of selecting multi-
space-platform, multi-resolution, and synoptic viewing capabilities.
• Keeping this in mind, besides building satellites, India embarked as
well on satellite launch vehicle development in the early 1970s.
• As a consequence, India has two very capable launch systems at
the start of the 21st century, namely PSLV (Polar Satellite Launch
Vehicle) and GSLV (Geosynchronous Satellite Launch Vehicle).
• IRS is the integrated LEO (Low Earth Orbit) element of India's
NNRMS (National Natural Resources Management System), with
the objective of providing a long-term space-borne operational
capability to India for the observation and
management of the country's natural resources (applications in
agriculture, hydrology, geology, drought and flood monitoring,
marine studies, snow studies, and land use).
• The intent of the program is to create an environment of new
perspectives for the Indian research community as a whole, to
stimulate the development of new technologies and applications,
and to utilize the Earth's resources in more meaningful ways.
• Note: The INSAT system is India's GEO (Geosynchronous Earth
Orbit) element, providing for simultaneous domestic
communications and earth observation functions.
Indian Remote Sensing Satellites
Satellite | Launch Date | Launch Vehicle | Type of Satellite
SARAL | 25.02.2013 | PSLV-C20 | Earth Observation Satellite
RISAT-1 | 26.04.2012 | PSLV-C19 | Earth Observation Satellite
Jugnu | 12.10.2011 | PSLV-C18 | Experimental / Small Satellite
SRMSat | 12.10.2011 | PSLV-C18 | Experimental / Small Satellite
Megha-Tropiques | 12.10.2011 | PSLV-C18 | Earth Observation Satellite
GSAT-12 | 15.07.2011 | PSLV-C17 | Geo-Stationary Satellite
GSAT-8 | 21.05.2011 | Ariane-5 VA-202 | Geo-Stationary Satellite
RESOURCESAT-2 | 20.04.2011 | PSLV-C16 | Earth Observation Satellite
YOUTHSAT | 20.04.2011 | PSLV-C16 | Experimental / Small Satellite
GSAT-5P | 25.12.2010 | GSLV-F06 | Geo-Stationary Satellite
STUDSAT | 12.07.2010 | PSLV-C15 | Experimental / Small Satellite
CARTOSAT-2B | 12.07.2010 | PSLV-C15 | Earth Observation Satellite
GSAT-4 | 15.04.2010 | GSLV-D3 | Geo-Stationary Satellite
Oceansat-2 | 23.09.2009 | PSLV-C14 | Earth Observation Satellite
ANUSAT | 20.04.2009 | PSLV-C12 | Experimental / Small Satellite
RISAT-2 | 20.04.2009 | PSLV-C12 | Earth Observation Satellite
Chandrayaan-1 | 22.10.2008 | PSLV-C11 | Space Mission
CARTOSAT-2A | 28.04.2008 | PSLV-C9 | Earth Observation Satellite
IMS-1 | 28.04.2008 | PSLV-C9 | Earth Observation Satellite
INSAT-4B | 12.03.2007 | Ariane-5ECA | Geo-Stationary Satellite
CARTOSAT-2 | 10.01.2007 | PSLV-C7 | Earth Observation Satellite
SRE-1 | 10.01.2007 | PSLV-C7 | Experimental / Small Satellite
INSAT-4CR | 02.09.2007 | GSLV-F04 | Geo-Stationary Satellite
INSAT-4C | 10.07.2006 | GSLV-F02 | Geo-Stationary Satellite
INSAT-4A | 22.12.2005 | Ariane-5GS | Geo-Stationary Satellite
HAMSAT | 05.05.2005 | PSLV-C6 | Experimental / Small Satellite
CARTOSAT-1 | 05.05.2005 | PSLV-C6 | Earth Observation Satellite
EDUSAT (GSAT-3) | 20.09.2004 | GSLV-F01 | Geo-Stationary Satellite
Resourcesat-1 (IRS-P6) | 17.10.2003 | PSLV-C5 | Earth Observation Satellite
INSAT-3A | 10.04.2003 | Ariane-5G | Geo-Stationary Satellite
INSAT-3E | 28.09.2003 | Ariane-5G | Geo-Stationary Satellite
INSAT-1B | 30.08.1983 | Shuttle [PAM-D] | Geo-Stationary Satellite
Rohini (RS-D2) | 17.04.1983 | SLV-3 | Earth Observation Satellite
INSAT-1A | 10.04.1982 | Delta 3910 PAM-D | Geo-Stationary Satellite
Bhaskara-II | 20.11.1981 | C-1 Intercosmos | Earth Observation Satellite
Ariane Passenger Payload Experiment (APPLE) | 19.06.1981 | Ariane-1 (V-3) | Geo-Stationary Satellite
Rohini (RS-D1) | 31.05.1981 | SLV-3 | Earth Observation Satellite
Rohini (RS-1) | 18.07.1980 | SLV-3 | Experimental / Small Satellite
Rohini Technology Payload (RTP) | 10.08.1979 | SLV-3 | Experimental / Small Satellite
Bhaskara-I | 07.06.1979 | C-1 Intercosmos | Earth Observation Satellite
Aryabhata | 19.04.1975 | C-1 Intercosmos | Experimental / Small Satellite
Basic Principle Of Remote Sensing
Objects and surfaces can be recognized and distinguished based on the
radiant energy emitted/reflected by them. This principle underpins
remote sensing, which detects and records the radiant energy for further
study.
Different objects and surfaces, like water, soil, or plants, return energy in
different electromagnetic bands in different quantities. What factors
actually impact this? Here are the main ones:
properties of the object: chemical and physical composition, the kind of
material, and surface roughness;
properties of the radiation: the radiant intensity, incidence angle, and
wavelength.
Diverse disciplines come together in remote sensing, including optical
science, photography, spectroscopy, electronics, computer science,
telecommunications, satellite launch technology, and more.
Elements Of Remote Sensing
Incident radiation interacts with the objects of study and is captured by the
sensor. A good example of this is an imaging system, which typically entails
the elements of remote sensing listed below. Yet, it's important to remember
that remote sensing also includes non-imaging sensor types and the detection
of released energy.
The following seven basic components form the backbone of remote sensing:
Source of energy/illumination. A source of energy must be available either to
illuminate the object of interest or to supply electromagnetic radiation to it.
Radiation/energy and the atmosphere. Throughout its path from its origin to its
destination, the radiation will interact with atmospheric particles. Once the
energy passes from the object to the sensor, a second interaction may take
place.
Elements Of Remote Sensing
Object of study. When the energy finally reaches its target, their interaction is
determined by the characteristics of the radiation itself and the object.
Radiation-recording sensor. A sensor that is physically apart from the object of
study must pick up the electromagnetic radiation the target emits or reflects.
Data processing facility. The sensor’s readings must be sent (usually
electronically) to a data receiving and processing facility, where the measured
energy is translated into a usable image.
Analysis and interpretation. Following data processing in remote sensing, the
image is analyzed and interpreted visually and/or digitally to acquire
information about the object of study.
Practical use. Lastly, the process involves putting the information we’ve
learned from the images to good use to gain a deeper understanding of the
target, uncover previously unknown facts, or aid in issue-solving.
REMOTE SENSING TERMINOLOGY
Atmospheric Interactions with Electromagnetic Radiation
All electromagnetic radiation detected by a remote sensor
has to pass through the atmosphere twice, before and after
its interaction with the earth's surface. This passage will
alter the speed, frequency, intensity, spectral distribution,
and direction of the radiation. As a result, atmospheric
scattering and absorption occur (Curran, 1988). These
effects are most severe at visible and infrared wavelengths,
the range most crucial in remote sensing.
During the transmission of energy through the atmosphere, light
interacts with gases and particulate matter in a process called
atmospheric scattering. The two major processes in scattering are
selective scattering and non-selective scattering. Rayleigh, Mie and
Raman scattering are of the selective type. Non-selective scattering is
independent of wavelength. It is produced by particles whose radii
exceed 10 µm, such as water droplets and ice fragments present in the
clouds. This type of scattering reduces the contrast of the image. While
passing through the atmosphere, electromagnetic radiation is scattered
and absorbed by gases and particulates.
Besides the major gaseous components like molecular nitrogen and
oxygen, other constituents like water vapour, methane, hydrogen,
helium and nitrogen compounds play an important role in
modifying the incident radiation and reflected radiation. This causes
a reduction in the image contrast and introduces radiometric errors.
Regions of the electromagnetic spectrum in which the atmosphere
is transparent are called atmospheric windows.
Energy interactions with Earth's surface materials
When electromagnetic energy is incident on any feature of the
earth's surface, such as a water body, various fractions of the energy
get reflected, absorbed, and transmitted, as shown in Fig. 2.13.
Applying the principle of conservation of energy, the
relationship can be expressed as:
E_I(λ) = E_R(λ) + E_A(λ) + E_T(λ)
where E_I is the incident energy and E_R, E_A, E_T are the reflected,
absorbed and transmitted components. All energy components are functions
of wavelength, λ. In remote sensing, the amount of reflected energy E_R(λ)
is more important than the absorbed and transmitted energies. Therefore,
it is more convenient to rearrange these terms as
E_R(λ) = E_I(λ) − [E_A(λ) + E_T(λ)]
The above equation is called the energy balance equation.
From this mathematical equation, two important points can be
drawn. Firstly,
transmittance and can be denoted as p(A.), cx.(A.) and y(A.). Simply, it can be
understood that, the measure of how much electromagnetic radiation is
reflected off a surface is called its reflectance. The reflectance range lies
between 0 and 1. A measure of 1.0 means that 100% of the incident radiation
is reflected off the surface, and a measure '0' means that 0% is reflected. The
reflectance characteristics are quantified by "spectral reflectance, p(A.) which
is expressed as the following ratio:
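As a minimal illustration of the balance equation, the short Python sketch below computes the reflectance, absorptance and transmittance ratios for one wavelength and checks that they sum to 1. The energy values used are made-up illustrative numbers, not measured data:

# Minimal sketch of the energy balance equation (illustrative values only).
def energy_ratios(e_incident, e_reflected, e_absorbed, e_transmitted):
    """Return (reflectance, absorptance, transmittance) for one wavelength."""
    rho = e_reflected / e_incident       # ρ(λ)
    alpha = e_absorbed / e_incident      # α(λ)
    tau = e_transmitted / e_incident     # τ(λ)
    return rho, alpha, tau

# Hypothetical energy components (same units, e.g. W/m^2) at one wavelength:
rho, alpha, tau = energy_ratios(100.0, 35.0, 50.0, 15.0)
print(rho, alpha, tau)                       # 0.35 0.5 0.15
print(abs(rho + alpha + tau - 1.0) < 1e-9)   # conservation: ratios sum to 1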
Milestones of Photogrammetry
• Four major phases are directly related to the technological inventions of photography, airplanes, computers and electronics.
• The first generation, from the middle to the end of the nineteenth century, was very much a pioneering and experimental phase, with remarkable achievements in terrestrial and balloon photogrammetry.
Analog Photogrammetry
• Optical or mechanical instruments were used to reconstruct 3-dimensional geometry from 2 overlapping photographs.
• The main products during this phase were topographic maps.
Analytical Photogrammetry
• The computer replaced some expensive optical and mechanical components.
• The resulting devices were analog/digital hybrids.
• The outputs of analytical photogrammetry were topographic maps and digital products such as the DEM (digital elevation model).
Digital Photogrammetry
• Digital photogrammetry is the art of using computers to obtain the measurements of objects in a photograph.
• The fourth generation, digital photogrammetry, is rapidly emerging as a new discipline in photogrammetry. In contrast to all other phases, digital images are used instead of aerial photographs.
Photogrammetry
What does photogrammetry require?
• Planning and taking photographs
• Processing the photographs
• Measuring photos and reducing measurements to produce end results
Field Applications
• Used to conduct topographical or engineering surveys
• Suitable for mountainous / hilly areas with little vegetation
• Used for geological mapping, which includes identification of landforms, rock types etc.
• Used in urban and regional planning, disaster management, transportation etc.
Photogrammetry
• Advantages
– Used in different fields, such as topographic mapping, engineering and manufacturing
– Covers larger areas
– Less time consuming
– Can reach inaccessible and restricted areas
– Economical for large areas, and easy to interpret and understand
• Disadvantages
– Complex system; highly trained human resources needed
– High initial investment cost; heavy and sophisticated equipment needed
– Weather dependent
The process of photogrammetry typically involves several steps:
Image Acquisition: Capturing a series of overlapping photographs or images of the object or scene from different angles. The images should have sufficient detail and overlap to ensure accurate reconstruction.
Image Preprocessing: Cleaning and preparing the images for further analysis. This may involve adjusting image parameters, such as brightness, contrast, and color balance, as well as removing lens distortion.
Feature Extraction: Identifying and extracting common features, such as corners or edges, from the images. These features serve as reference points for matching and establishing correspondences between images.
Image Matching: Comparing the extracted features across different images to find corresponding points. This process involves algorithms that analyze the visual similarities between the features and establish accurate correspondences.
Bundle Adjustment: Refining the camera positions and orientations by minimizing the errors in the matching process. Bundle adjustment ensures the accuracy and consistency of the reconstructed 3D model.
Dense Point Cloud Generation: Creating a dense representation of the object or scene by estimating the 3D coordinates of numerous points on its surface. This is achieved by triangulating the matched features and estimating their positions in 3D space.
Surface Reconstruction: Creating a continuous and detailed surface model from the dense point cloud. Various algorithms, such as Delaunay triangulation or mesh generation methods, can be used to create a mesh representing the object's surface.
Texture Mapping: Applying the original images onto the reconstructed surface to generate a realistic and textured 3D model. This process enhances the visual appearance of the model by projecting the images onto the corresponding surface areas.
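The feature extraction and image matching steps above can be sketched in a few lines of Python using the OpenCV library. This is a minimal sketch, assuming two overlapping photographs with the hypothetical file names left.jpg and right.jpg; it is not the full pipeline, which would continue with bundle adjustment and dense reconstruction:

import cv2

# Load two overlapping photographs in grayscale (hypothetical file names).
img1 = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

# Feature extraction: detect ORB keypoints and compute their descriptors.
orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Image matching: brute-force matching with cross-checking for reliability.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Each match links a point in img1 to its corresponding point in img2.
for m in matches[:10]:
    print(kp1[m.queryIdx].pt, "->", kp2[m.trainIdx].pt)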
Photogrammetry has a wide range of applications in various industries, including architecture, engineering, cultural heritage preservation, virtual reality, and video game development. It enables the creation of accurate 3D models, measurements, and visualizations, providing valuable data for analysis, documentation, and immersive experiences.
TYPES OF PHOTOGRAPHS
• The photographs used in photogrammetry may be broadly classified into two types depending upon the camera position at the time of photography.
• The types are Terrestrial Photographs and Aerial Photographs.
Aerial photogrammetry is commonly used in mapping, surveying, and remote sensing applications. It involves capturing images from an elevated position, such as an aircraft or a drone, to create large-scale maps or 3D models of landscapes, buildings, or entire cities.
Terrestrial photogrammetry, on the other hand, focuses on capturing images from ground-level perspectives. It can be used to create detailed models of smaller objects, such as archaeological artifacts, sculptures, or architectural elements. Terrestrial photogrammetry often requires careful planning of image acquisition and the use of specialized equipment, such as calibrated cameras and precise measurement tools.
Principle and types of aerial photographs
The principle of aerial photography involves capturing photographs of the Earth's surface from an elevated position, typically from an aircraft or a drone. These photographs provide a bird's-eye-view perspective and are used in various applications such as mapping, land surveying, environmental monitoring, and urban planning. The key principles of aerial photography include:
Overlapping Coverage: Aerial photographs are captured with a significant amount of overlap between successive images. This overlap ensures that there are common features visible in multiple photographs, allowing for accurate measurement and reconstruction of the terrain or objects.
Scale and Ground Resolution: Aerial photographs have a specific scale determined by the altitude of the aircraft and the focal length of the camera. Ground resolution refers to the level of detail captured in the photograph, usually expressed as the size of a ground feature represented by a pixel in the image.
Vertical Perspective: Aerial photographs are typically captured with a vertical perspective, meaning that the camera axis is pointed straight down towards the Earth's surface. This perspective minimizes distortions and simplifies the process of geometric correction and measurement.
Stereoscopic Vision: Aerial photographs are often captured in pairs or sets to enable stereoscopic viewing. Stereoscopic vision allows for depth perception and the creation of three-dimensional models by observing the displacement of features in the overlapping photographs.
There are different types of aerial photographs used in various applications. The main types include:
Vertical Aerial Photographs: These are captured with the camera axis pointed straight down, providing a vertical perspective. Vertical photographs are commonly used for mapping, land surveying, and photogrammetric analysis. They provide accurate measurements and allow for the creation of orthophotos (geometrically corrected aerial images) and digital elevation models (DEMs).
Oblique Aerial Photographs: In oblique aerial photography, the camera is tilted at an angle, capturing images with a non-vertical perspective. Oblique photographs are often used for visual interpretation, architectural photography, and landscape analysis. They provide a more artistic or dramatic representation of the subject but are less suitable for precise measurement or mapping.
Near-vertical Aerial Photographs: Near-vertical photographs are captured with a slight tilt, providing a perspective that is between vertical and oblique. These photographs offer a balance between the accuracy of vertical photographs and the visual context provided by oblique photographs. They are useful for applications such as urban planning, site analysis, and environmental studies.
Thermal Infrared Aerial Photographs: Thermal infrared aerial photography involves capturing images in the thermal infrared spectrum. These photographs can reveal variations in temperature, which can be useful in applications such as environmental monitoring, agriculture, and heat loss analysis of buildings.
Multispectral Aerial Photographs: Multispectral aerial photographs are captured using cameras equipped with filters sensitive to different parts of the electromagnetic spectrum. These photographs provide information beyond the visible range, enabling analysis of vegetation health, soil composition, and other environmental factors.
Each type of aerial photograph has its own advantages and applications, and the selection depends on the specific needs of the project or study.
• Aerial photographs can be classified as oblique or vertical, based on the orientation of the camera relative to the ground at the time of acquisition.
Vertical Photographs
• These photographs are taken from the air with the axis of the camera vertical or nearly vertical.
• A truly vertical photograph closely resembles a map.
• These are utilized for the compilation of topographic and engineering surveys on various scales.
Oblique vs Vertical Photographs (comparison photographs)
Principle and types of stereoscopy
The principle of stereoscopy is based on the concept of binocular vision, which is the ability of humans and some animals to perceive depth and three-dimensional information by processing the slightly different views seen by each eye. Stereoscopy mimics this natural visual perception by presenting two slightly offset images to each eye, creating the illusion of depth and a sense of three-dimensionality. The key principles of stereoscopy involve the following:
Stereoscopic Pair: A stereoscopic pair consists of two images, called left and right views, which are captured or created from slightly different perspectives. These images represent what each eye would see if viewing the scene separately.
Parallax: Parallax is the apparent shift or displacement of objects when viewed from different positions. In stereoscopy, the offset between the left and right views creates parallax, and this parallax is used to generate the perception of depth.
Binocular Fusion: Binocular fusion is the brain's ability to merge the two slightly offset images into a single perception of a three-dimensional scene. By aligning and combining corresponding points in the left and right views, the brain reconstructs the depth and spatial relationships of the objects in the scene.
There are different types of stereoscopy techniques used to create the illusion of depth and provide a three-dimensional viewing experience. Some common types include:
Anaglyph Stereoscopy: Anaglyph stereoscopy involves using specially filtered glasses with one red lens and one cyan, green, or blue lens. The left and right views are presented as differently colored images (usually red and cyan). When viewed through the corresponding color filters, the brain merges the images and perceives depth.
Polarized Stereoscopy: Polarized stereoscopy uses glasses with different polarizing filters for each eye (usually circular or linear polarization). The left and right views are projected or displayed with matching polarizations. The polarized glasses filter out the respective views, allowing each eye to see the intended image, resulting in a stereoscopic effect.
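A red-cyan anaglyph of the kind described above can be assembled digitally by taking the red channel from the left view and the green and blue channels from the right view. Below is a minimal Python sketch, assuming two already-aligned stereo images with the hypothetical file names left.jpg and right.jpg:

import numpy as np
from PIL import Image

# Load an aligned stereo pair (hypothetical file names).
left = np.asarray(Image.open("left.jpg").convert("RGB"))
right = np.asarray(Image.open("right.jpg").convert("RGB"))

# Red channel from the left eye's view; green and blue from the right.
anaglyph = np.zeros_like(left)
anaglyph[..., 0] = left[..., 0]    # R <- left view
anaglyph[..., 1] = right[..., 1]   # G <- right view
anaglyph[..., 2] = right[..., 2]   # B <- right view

Image.fromarray(anaglyph).save("anaglyph.jpg")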
Shutter Glasses Stereoscopy: Shutter glasses, also known as active shutter glasses, are synchronized with a display device that alternates rapidly between showing the left and right views. The glasses contain liquid crystal lenses that open and close in sync with the display, allowing each eye to see the correct image. This technique provides high-quality stereoscopic images but requires compatible hardware.
Autostereoscopy: Autostereoscopy refers to the ability to view three-dimensional images without the need for specialized glasses or additional devices. Various autostereoscopic methods exist, including lenticular lenses, parallax barriers, and holography. These techniques use optical elements to direct different views to each eye, enabling the perception of depth without the need for additional aids.
Stereoscopy has applications in various fields, including entertainment (3D movies, virtual reality), scientific visualization, medical imaging, and product design. It provides an immersive and realistic viewing experience by adding depth and enhancing the perception of the three-dimensional world.
What are 3D Glasses?
These glasses utilize special red / cyan lenses to interpret the image. The lenses produce the images you see by color-filtering the layered image that you are actually looking at: while one lens filters out all the red in the image, the other lens filters out the cyan, causing your brain to see the picture in 3D.
Map Vs Mosaic
Map and mosaic are both terms used in the context of geospatial data and imagery, but they refer to different concepts:
Map: A map is a visual representation or depiction of geographic information, typically on a flat surface. It can be a two-dimensional representation of the Earth's surface or a specific area, showing various features such as landforms, roads, rivers, political boundaries, and other relevant information. Maps are created using cartographic techniques and can be used for navigation, analysis, planning, and communication of spatial information. Maps can be created through various means, including manual drawing, computer-aided design (CAD), or digital mapping software. They often involve the aggregation and symbolization of different data layers to convey specific information and facilitate understanding of spatial relationships.
Mosaic: A mosaic, in the context of geospatial imagery, refers to the process of combining multiple individual images or photographs to create a seamless and continuous image of a larger area. It involves aligning and blending the individual images to create a composite representation of the entire scene. Mosaicking is commonly used in remote sensing and aerial photography applications, where multiple overlapping images are captured to cover large areas. By stitching these images together, a single mosaic image is created, which provides a comprehensive view of the entire region. Mosaics can be created with various software tools and algorithms that handle image registration, color correction, and blending.
Mosaics are useful for visual interpretation, analysis, and mapping purposes. They allow for a comprehensive view of large areas, enabling the identification and understanding of spatial patterns, land cover changes, and other geospatial features. Mosaics can also serve as base maps for further analysis or overlaying additional data layers.
In summary, while a map represents spatial information in a two-dimensional format, a mosaic is a composite image created by stitching together multiple individual images to cover a larger area. Maps provide a general representation of geographic features and their relationships, while mosaics offer a detailed visual depiction of the actual surface as captured by imagery.
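As an illustration of mosaicking, the OpenCV library ships a high-level stitcher that performs the registration and blending steps described above internally. A minimal Python sketch, assuming a list of overlapping frames whose file names are hypothetical:

import cv2

# Overlapping frames covering adjacent parts of the scene (hypothetical names).
frames = [cv2.imread(f) for f in ("strip1.jpg", "strip2.jpg", "strip3.jpg")]

# The Stitcher handles feature matching, registration, and blending internally.
stitcher = cv2.Stitcher_create()
status, mosaic = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("mosaic.jpg", mosaic)
else:
    print("Stitching failed with status", status)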
Ground control
Ground control, in the context of geospatial data and mapping, refers to the use of known reference points or features on the Earth's surface to establish the accuracy and spatial positioning of remotely sensed imagery, aerial photographs, or other geospatial data. Ground control points (GCPs) are typically physical features with known coordinates that are visible in both the imagery and on the ground. These GCPs are used to align, georeference, and rectify the imagery to a known coordinate system.
The process of ground control involves the following steps:
Selection of Ground Control Points: Several GCPs are selected on the ground, and their coordinates are measured using surveying techniques such as Global Navigation Satellite Systems (GNSS) or Total Station. GCPs should be easily identifiable and have stable positions over time.
Identification in Imagery: The selected GCPs are identified in the imagery or remotely sensed data. This can be done manually by visually matching the GCPs in the imagery or through automated algorithms that detect and match features between the ground and the imagery.
Georeferencing: The coordinates of the GCPs on the ground are associated with their corresponding locations in the imagery. This process involves determining the transformation parameters to align the imagery with the desired coordinate system. Common transformation methods include affine, polynomial, or projective transformations.
Rectification: Once the imagery is georeferenced, it can be rectified to remove any distortions caused by terrain relief or sensor characteristics. Rectification ensures that the image has a uniform scale and represents the Earth's surface accurately. It involves resampling the imagery to a regular grid and correcting for any geometric distortions.
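The affine case mentioned above can be written as X = a0 + a1*x + a2*y and Y = b0 + b1*x + b2*y, where (x, y) are pixel coordinates and (X, Y) ground coordinates. Given at least three GCPs, the six parameters can be estimated by least squares. A minimal Python sketch with made-up GCP values (the coordinates below are illustrative, not real survey data):

import numpy as np

# Hypothetical GCPs: pixel coordinates and their measured ground coordinates.
pixel = np.array([[100, 120], [820, 140], [450, 690], [90, 700]], float)
ground = np.array([[5000.0, 9000.0], [5720.0, 8980.0],
                   [5350.0, 8430.0], [4990.0, 8420.0]])

# Design matrix for the affine model: one row [1, x, y] per GCP.
A = np.column_stack([np.ones(len(pixel)), pixel])

# Solve the two least-squares problems (one per ground axis).
coef_X, *_ = np.linalg.lstsq(A, ground[:, 0], rcond=None)
coef_Y, *_ = np.linalg.lstsq(A, ground[:, 1], rcond=None)

def pixel_to_ground(x, y):
    """Map a pixel position to ground coordinates with the fitted affine model."""
    return (coef_X @ [1, x, y], coef_Y @ [1, x, y])

print(pixel_to_ground(450, 400))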
Ground control points are essential in various applications, including:
Orthorectification: GCPs are used to rectify aerial photographs or satellite imagery to produce orthorectified images, which have a uniform scale and can be used for accurate measurement, mapping, and analysis.
Geospatial Data Integration: GCPs enable the alignment and integration of different geospatial data sources, such as aerial imagery, LiDAR (Light Detection and Ranging) data, and satellite imagery, to create a consistent and accurate geospatial dataset.
Change Detection: GCPs provide a reference framework to detect and quantify changes over time by comparing imagery captured at different periods.
Accuracy Assessment: GCPs serve as ground-truth points for evaluating the accuracy of geospatial products, such as orthophotos, maps, or digital elevation models (DEMs).
Overall, ground control plays a critical role in establishing the accuracy, georeferencing, and rectification of geospatial data, ensuring that the data can be used reliably for mapping, analysis, and decision-making.
Parallax measurements for height and determinations
Parallax measurements can be used to determine the height or distance of an object or feature by exploiting the principle of triangulation. The basic concept involves observing an object from two different positions and measuring the angular displacement, or parallax, between the two views. By knowing the baseline distance between the two viewpoints and the measured parallax angle, it is possible to calculate the height or distance to the object using trigonometry. Here is a general overview of how parallax measurements can be used for height and distance determinations:
Baseline Setup: Establish two observation points or viewpoints from which the object or feature of interest can be observed. The distance between these viewpoints is referred to as the baseline distance. It is crucial to accurately measure or know the baseline distance for accurate results.
Observations: From each viewpoint, observe the object or feature, making sure it is visible and identifiable in both views. Note the angular displacement or parallax between the two views. Parallax is the apparent shift of the object's position when viewed from different positions.
Parallax Angle Measurement: Measure the parallax angle, which is the angular displacement of the object as observed from the two viewpoints. This can be done using instruments such as theodolites, total stations, or specialized photogrammetric equipment.
Trigonometric Calculation: With the baseline distance and the measured parallax angle, trigonometric calculations can be used to determine the height or distance to the object. The specific calculations depend on the geometry of the setup and the type of parallax being measured.
Determining Object Height: If the object is on or near the Earth's surface, the height can be calculated using simple trigonometry. The height (h) of the object is given by the formula h = b * tan(α), where b is the baseline distance and α is the parallax angle.
Determining Object Distance: If the object is at a significant distance and its height is known, the distance (d) to the object can be calculated using trigonometry. The distance is given by the formula d = h / tan(α), where h is the object height and α is the parallax angle.
It is important to note that accurate measurements of the baseline distance and the parallax angle are crucial for obtaining reliable height or distance determinations. Additionally, factors such as atmospheric conditions, instrument accuracy, and object visibility should be considered to ensure accurate results. Parallax measurements can be applied in various fields, including surveying, photogrammetry, astronomy, and remote sensing. They provide a method for estimating heights or distances without directly accessing the object, making them useful for inaccessible or remote locations.
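The two formulas above translate directly into code. A minimal Python sketch; the baseline and angle values are made-up illustrative numbers, not survey data:

import math

def height_from_parallax(baseline_m, parallax_deg):
    """h = b * tan(alpha): object height from baseline and parallax angle."""
    return baseline_m * math.tan(math.radians(parallax_deg))

def distance_from_parallax(height_m, parallax_deg):
    """d = h / tan(alpha): object distance when the height is known."""
    return height_m / math.tan(math.radians(parallax_deg))

print(height_from_parallax(50.0, 12.0))     # ~10.63 m for a 50 m baseline
print(distance_from_parallax(10.63, 12.0))  # back to ~50 m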
SCALE OF VERTICAL PHOTOGRAPH
• The scale of a vertical photograph is the ratio of a distance on the photo to the corresponding distance on the ground.
• Scale varies for points of different elevation.
• Scale remains constant only for points having the same elevation.
• If all points are at the same elevation, the scale is given by
S = f / (H − h)
where H = flying height of the camera above mean sea level, f = focal length of the camera, and h = height of the ground above mean sea level.
• Case 2: If A and B are two points having elevations ha and hb respectively above mean sea level, then
Scale of photograph at elevation ha: Sa = f / (H − ha)
Scale of photograph at elevation hb: Sb = f / (H − hb)
Scale of photograph for any elevation h: Sh = f / (H − h)
This can also be represented as a representative fraction, Rn = 1 : (H − h)/f.
• Datum Scale: the scale if all points are projected at mean sea level, Sd = f / H.
• Average Scale: the scale if all points are projected on a plane representing the average elevation hav, Sav = f / (H − hav).
Computation of length of a line between points of different elevation
Let the ground coordinates (in plan) of points A and B be (Xa, Ya) and (Xb, Yb), and let the corresponding points on the photograph be (xa, ya) and (xb, yb). From similar triangles,
For point A: Xa = ((H − ha) / f) * xa and Ya = ((H − ha) / f) * ya
For point B: Xb = ((H − hb) / f) * xb and Yb = ((H − hb) / f) * yb
So the X and Y of any point follow from its photo coordinates and elevation, and the length between the two points A and B is given by
L = sqrt((Xa − Xb)^2 + (Ya − Yb)^2)
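Putting the scale relations and the length formula together, a minimal Python sketch; the flying height, focal length, elevations and photo coordinates below are illustrative values only:

import math

def photo_to_ground(x_cm, y_cm, H_m, f_cm, h_m):
    """Ground plan coordinates from photo coordinates: X = ((H - h)/f) * x."""
    k = (H_m - h_m) / (f_cm / 100.0)     # scale denominator; f converted to metres
    return k * (x_cm / 100.0), k * (y_cm / 100.0)

# Illustrative inputs: H = 2000 m, f = 20 cm, two image points at different elevations.
Xa, Ya = photo_to_ground(+4.0, +3.0, 2000.0, 20.0, 250.0)
Xb, Yb = photo_to_ground(-2.5, +1.0, 2000.0, 20.0, 150.0)

L = math.hypot(Xa - Xb, Ya - Yb)   # ground length of line AB
print(round(L, 1), "m")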
Relief Displacement
• When the ground is not horizontal, the scale of the photograph varies from point to point. The scale of an aerial photograph is partly a function of flying height; therefore variations in elevation cause variations in the scale of aerial photographs, and every point on the photograph is displaced from its true position.
• The effect of relief does not only cause a change in scale; it can also be considered a component of image displacement. The higher the elevation of an object, the farther the object will be displaced from its actual position away from the principal point of the photograph (the point on the ground surface that is directly below the camera lens). The lower the elevation of an object, the more it will be displaced towards the principal point. Because of the different elevations of different points, every point on the photograph is displaced from its original position; this displacement is called relief displacement.
• Consider a point T on top of a building and a point B at the bottom. On a map, both points have identical X, Y coordinates; however, on the photograph they are imaged at different positions, namely T' and B'. The distance d between the two photo points is called relief displacement because it is caused by the elevation difference Δh between T and B.
• The relief displacement is given by
d = r * h / H
where r is the radial distance of the image point from the principal point, h is the height of the object above the datum, and H is the flying height. If the relief displacement is known, then the height of the object is h = d * H / r.
Problem:
The distance from the principal point to an image point on a photograph is 6.44 cm and the elevation of the object above the datum is 250 m. What is the relief displacement of the point if the datum scale is 1/10000 and the focal length of the camera is 20 cm?
Given: Sd = 1/10000; r = 6.44 cm; h = 250 m; f = 20 cm.
Datum scale Sd = f / H, so 1/10000 = 20 cm / H, giving H = 20 cm × 10000 = 2000 m above MSL.
Relief displacement: d = r * h / H = (6.44 cm × 250 m) / 2000 m = 0.805 cm.
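The same computation as a small Python sketch; it simply reproduces the worked problem above, with units handled explicitly:

def relief_displacement(r_cm, h_m, H_m):
    """d = r*h/H: relief displacement on the photo, in the same units as r."""
    return r_cm * h_m / H_m

# Worked problem: datum scale 1/10000 with f = 20 cm gives H = 2000 m.
f_cm, scale_number = 20.0, 10000.0
H_m = f_cm / 100.0 * scale_number             # 2000 m above MSL
print(relief_displacement(6.44, 250.0, H_m))  # 0.805 cm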
Photo scale
• We use the representative fraction for scale expressions, in the form of a ratio, e.g. 1 : 5,000.
• The scale of a near-vertical photograph can be approximated by the equation
mb = H / c
where mb is the photograph scale number, c the calibrated focal length, and H the flight height above mean ground elevation.
• Note that the flight height H refers to the average ground elevation. If it is with respect to the datum, then it is called the flight altitude HA, with HA = H + h.
• The photograph scale varies from point to point. For example, the scale for point P can be determined as the ratio of the image distance CP' to the object distance CP:
sP = CP' / CP = sqrt(xP^2 + yP^2 + c^2) / sqrt((XP − XC)^2 + (YP − YC)^2 + (ZP − ZC)^2)
where xP, yP are the photo coordinates, XP, YP, ZP the ground coordinates of point P, and XC, YC, ZC the coordinates of the projection centre C in the ground coordinate system.
• The above equation takes into account any tilt and topographic variations of the surface (relief).
Why Flight Planning?
• To ensure coverage of an area to be mapped at the required scale without any gaps in between.
• The product of a flight plan is basically a map.
• A flight plan will determine the spacing between successive photographs, the locations of flight lines, and the start and end locations of each flight path.
Flight Planning
• Success of any photogrammetric project depends mainly on the acquisition of good-quality pictures.
• Due to weather and terrain conditions, time frames for photography are limited:
– Ideally, cloud cover < 10% is acceptable.
– Clouds higher than the flying height might cast large shadows on the ground.
– Windy days may cause excessive image motion and difficulties in camera and aircraft orientation.
– Areas near industries may be susceptible to atmospheric haze and smog.
• Re-flights can be very expensive and cause delays in projects. Therefore the flight path must be carefully planned and executed accordingly.
Overlapping Imagery
• Aerial photographs are rarely a single-shot event. Multiple photographs are taken along a flight path to ensure complete photo coverage of an area.
• To achieve this we must have overlap between photo images. The overlapping of photos end to end is termed end lap.
• An end lap of 30% will avoid potential missing areas of coverage due to the effects of turbulence. For imagery collected for use in stereo viewing, an end lap of 60% is considered ideal.
• For block coverage of an area at a specific photo scale, it is often necessary to fly parallel strips of aerial photography. The adjacent strips also overlap each other; this overlapping area is termed side lap and is generally specified at 30% to ensure good coverage.
Side lap between adjacent strips (figure)
Effect of topography on scale and overlap (figure)
Number of photographs required
Let A = total area to be photographed
l = length of photograph in the direction of flight
w = width of the photograph normal to the direction of flight
s = scale of the photograph
L = net ground distance corresponding to l
W = net ground distance corresponding to w
a = net ground area covered by each photograph = L × W
Pl = percentage overlap between successive photographs in the direction of flight
Pw = side lap
Since each photograph has a longitudinal overlap of Pl, the actual ground length covered by each photograph is L = (1 − Pl) s l.
Similarly, the actual ground width covered by each photograph is W = (1 − Pw) s w.
Hence the ground area covered by each photograph is a = L × W = (1 − Pl) s l × (1 − Pw) s w.
Number of photographs required: N = A / a.
Example: The scale of an aerial photograph is 1 cm = 100 m. The photograph size is 20 cm × 20 cm. Determine the number of photographs required to cover an area of 100 sq. km if the longitudinal lap is 60% and the side lap is 30%.
Given: l = 20 cm, w = 20 cm, Pl = 0.60, Pw = 0.30, s = 100 m/cm.
The actual ground length covered by each photograph: L = (1 − Pl) s l = (1 − 0.60) × 100 × 20 = 800 m = 0.8 km.
The actual ground width covered by each photograph: W = (1 − Pw) s w = (1 − 0.30) × 100 × 20 = 1400 m = 1.4 km.
Net area covered by each photograph: a = L × W = 0.8 × 1.4 = 1.12 sq. km.
Hence the number of photographs required: N = A / a = 100 / 1.12 = 89.3, say 90.
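The same calculation as a minimal Python sketch, reproducing the example above with the inputs given in the problem:

import math

def photos_required(area_km2, l_cm, w_cm, s_m_per_cm, p_long, p_side):
    """N = A / a, with a = (1-Pl)*s*l * (1-Pw)*s*w converted to km^2."""
    L_km = (1 - p_long) * s_m_per_cm * l_cm / 1000.0   # net ground length
    W_km = (1 - p_side) * s_m_per_cm * w_cm / 1000.0   # net ground width
    return math.ceil(area_km2 / (L_km * W_km))

print(photos_required(100.0, 20.0, 20.0, 100.0, 0.60, 0.30))  # 90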
BASIC TERMS USED IN PHOTOGRAMMETRY
• Tilted Photograph: An aerial photograph taken with a camera whose optical axis is tilted, usually less than 3° from the vertical, is known as a tilted photograph.
• Exposure (or air) station: The exact position of the front nodal point of the lens in the air at the instant of exposure.
• Flying height (H): The elevation of the air station above mean sea level is known as the flying height of the aircraft.
• Nadir Point (Plumb Point): The point where a plumb line dropped from the front nodal point strikes the photograph.
• Camera Axis: The line passing through the centre of the camera lens perpendicular both to the camera plate (negative) and the picture plane (photograph).
• Fiducial mark: One of two, three or four marks, located in contact with the photographic emulsion in a camera image plane, that provide a reference line or lines for the plate measurement of images.
• Principal Point: The point where a perpendicular dropped from the front nodal point strikes the photograph is known as the principal point of the photograph.
• Focal length: The perpendicular distance from the centre of the camera lens to either the picture plane or the camera plate.
REMOTE SENSING
 The study of something without making actual contact with the object.
 Making measurements of the physical properties of an object from a remote distance.
 Satellite technology is an example of remote sensing: satellites measure properties of the Earth and transmit the data to receiving stations.
REMOTE SENSING includes all methods and techniques used to gain qualitative and quantitative information about distant objects without coming into direct contact with these objects.
 Look-Look, NO Touch
Remote Sensing (RS) methods try to answer four basic questions:
• HOW MUCH of WHAT is WHERE?
• What is the SHAPE and EXTENT of ...? (Area, Boundaries, Lineaments, ...)
• Has it CHANGED?
• What is the MIX of objects?
HOW MUCH of WHAT is WHERE?
WHAT: Type, characteristics and properties of the object. For example: water, vegetation, land; temperature, concentration, state of development; subtype, species, use of ... . This includes determination of generic object type, character and property as well as its abstract meaning (DATA INTERPRETATION).
HOW MUCH: Determined by simple COUNTING, by measuring the AREA covered, or as a percentage of the total area coverage.
WHERE: Relate locations and area covered either to a standard map or to the actual location on the 'ground' where the object occurs. NOTE: WHERE also refers to a moment in time.
What is the SHAPE and EXTENT of ...? (Area, Boundaries, Lineaments, ...)
This extends the 'WHERE' to be a completely GEOMETRIC problem. MAP PRODUCTION methods are to be applied to the analysis of RS information. These include photogrammetric methods: identification and delineation of boundaries and lineaments (roads, rivers, fault lines).
Has it CHANGED?
CHANGE may occur with the progress of TIME. Change may be detected through comparison of observed states at different moments in time. => CHANGE DETECTION
What is the MIX of objects?
The surface of the Earth is covered by objects like soil, water, grass, trees, houses, roads and so on. These are 'GENERIC OBJECTS'. We know these well, but we also know objects like open forest, residential and industrial estates, etc. Each of these ABSTRACT OBJECTS is made up of a typical collection of generic objects.
"Remote sensing is the science (and to some extent, art) of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy and processing, analyzing, and applying that information."
The Process of Remote Sensing
A. There are interactions with the atmosphere.
B. The energy reaches the target, or the object on Earth being studied, and interacts with the target based on the target's properties.
C. Energy scattered by or emitted from the target is then collected by the sensor.
D. The sun, or the satellite itself, is the energy source that provides electromagnetic energy.
E. The sensor transmits the electronic information to a receiving and processing station, where it is processed into an image.
F. The processed image is then interpreted to learn about the target.
G. The information is applied so that we better understand the target, learn something new about the target, or solve a particular problem.
ELECTROMAGNETIC SPECTRUM
Electromagnetic energy travels in waves and spans a broad spectrum from very long radio waves to very short gamma rays. The human eye can detect only a small portion of this spectrum, called visible light. A radio detects a different portion of the spectrum, and an X-ray machine uses yet another portion.
Radiation energy that is emitted in wave form by all substances is the basis for all remote sensing of the earth.
Electromagnetic Radiation
Electromagnetic radiation consists of an electric field, E, and a magnetic field, M. Both of these fields travel at the speed of light, c. Different kinds of electromagnetic radiation can be distinguished by wavelength and frequency.
Energy Source and its Characteristics
All objects whose temperature is greater than absolute zero (0 K, i.e. −273 °C) emit radiation. All stars and planets emit radiation. Our chief star, the sun, is an almost spherical body with a diameter of 1.39 × 10^6 km at a mean distance from the earth equal to 1.5 × 10^8 km. The continuous conversion of hydrogen to helium, which is the main constituent of the sun, generates the energy that is radiated from the outer layers. If the energy received at the edge of the earth's atmosphere were distributed evenly over the earth, it would give an average incident flux density of 1367 W/m^2. This is known as the solar constant.
Visible Light
⚫ Visible light is the light that our eyes can see.
⚫ Visible light makes up an extremely small part of the electromagnetic spectrum, ranging from about 0.4 to 0.7 µm.
⚫ Blue, green and red are the primary colors of light. All other colors can be made by combining them in various proportions.
⚫ Each color has a different wavelength. Red has the longest wavelength and violet the shortest. When all the waves are seen together, they make white light.
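Wavelength and frequency are related through the speed of light, c = λν, and the photon energy is E = hν. The short Python sketch below evaluates both for the edges of the visible band; it relies only on standard physical constants and the 0.4-0.7 µm range quoted above:

C = 2.998e8            # speed of light, m/s
H_PLANCK = 6.626e-34   # Planck constant, J*s

for name, wl_um in (("violet edge", 0.4), ("red edge", 0.7)):
    wl_m = wl_um * 1e-6
    freq = C / wl_m                 # c = lambda * nu
    energy_j = H_PLANCK * freq      # E = h * nu
    print(f"{name}: {freq:.2e} Hz, {energy_j:.2e} J per photon")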
INTERACTION OF EM RADIATION WITH THE ATMOSPHERE
EM radiation must pass through the atmosphere in order to reach the earth's surface, and again to reach the sensor after reflection and emission from earth surface features. Water vapour, oxygen, ozone, CO2, aerosols etc. present in the atmosphere influence electromagnetic radiation through the mechanisms of SCATTERING and ABSORPTION.
SCATTERING
Scattering is the unpredictable diffusion of radiation by molecules of gas, dust and smoke. Scattering reduces contrast and changes the spectral signatures of ground objects.
Rayleigh Scattering: occurs when the diameter of the gas molecules or particles is much less than the wavelength of the radiation. The shorter the wavelength, the stronger the scattering (this is why the sky appears blue during midday and red during sunrise and sunset).
Mie Scattering: occurs where the diameter of water vapour or dust particles approximately equals the wavelength of the radiation.
Non-Selective Scattering: occurs when the diameter of the particles is several times (approximately ten times) the radiation wavelength. Pollen grains, cloud droplets, ice and snow crystals and raindrops are the main sources of non-selective scattering. These scatter all the wavelengths of visible light with equal efficiency (this is the reason clouds appear white).
ABSORPTION: Atmospheric absorption results in the effective loss of energy to atmospheric constituents. Oxygen absorbs in the ultraviolet region; CO2 also prevents energy from reaching the earth; water vapour is an extremely important absorber within the infrared part of the spectrum.
The wavelengths at which EM radiation is partially or wholly transmitted through the atmosphere are known as ATMOSPHERIC WINDOWS and are used to acquire remote sensing data.
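Rayleigh scattering intensity varies as 1/λ^4, which is why the short (blue) wavelengths dominate scattered skylight. A minimal Python sketch comparing blue and red light; the 1/λ^4 dependence is a standard textbook relation, not a formula from these slides:

def rayleigh_relative(wl_um, ref_um=0.4):
    """Relative Rayleigh scattering strength, proportional to 1/lambda^4."""
    return (ref_um / wl_um) ** 4

# Blue (0.45 um) vs red (0.65 um): blue is scattered several times more strongly.
blue, red = rayleigh_relative(0.45), rayleigh_relative(0.65)
print(blue / red)   # ~4.35x stronger scattering for blue light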
REMOTE SENSING OBSERVATION PLATFORMS
(1) Airborne platforms and (2) space-based platforms.
Airborne Platforms (earlier days)
Three types of aircraft were used for remote sensing operations: the Dakota, the AVRO and the Beechcraft Super King Air 200. These are expensive and cannot provide cost- and time-effective solutions.
Space-based platforms
What Are Satellites?
⚫ Satellites are smaller objects traveling around larger objects.
⚫ Satellites may be man-made or natural, like the moon.
⚫ The two main types of satellites are polar-orbiting and geostationary.
⚫ Satellites are designed for three general purposes: science, applications, or communications.
Artificial Satellites
Artificial satellites are human-made spacecraft that are built and sent into space by people. These spacecraft can be crewed, such as the Space Shuttle, or uncrewed, such as NASA's Hubble Space Telescope. (Examples: the Hubble Space Telescope, communications satellites, the NPOESS satellite.)
Polar-Orbiting Satellites
Polar-orbiting satellites travel in a circular pattern over the North and South Poles, so they can look at large portions of the Earth as it turns below them. Polar-orbiting satellites are placed into a low Earth orbit: they orbit at about 800 kilometers (500 miles) above the Earth and travel at about 17,000 miles per hour. (Example: NPOESS, with a fixed local equatorial crossing time.)
Geostationary Satellites
Geostationary satellites orbit the Earth at about 36,000 km above the equator. Seen from Earth, the satellite appears to be floating over a certain spot on the equator. They are primarily used for weather and communication.
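The roughly 36,000 km figure follows from Kepler's third law: a circular orbit whose period equals one sidereal day (~86,164 s) has radius r = (GM T^2 / 4π^2)^(1/3). A short Python check using standard physical constants; this derivation is added here for illustration and is not on the slide:

import math

GM_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86164.1      # one rotation of the Earth, s
EARTH_RADIUS = 6378.0       # equatorial radius, km

# Kepler's third law for a circular orbit: r = (GM * T^2 / (4*pi^2))^(1/3)
r_m = (GM_EARTH * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = r_m / 1000.0 - EARTH_RADIUS
print(round(altitude_km))   # ~35786 km, i.e. roughly 36,000 km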
Scientific Satellites
⚫ The most well-known type of satellite.
⚫ Information from these satellites clarifies the Earth's history, present condition, and what the future may hold.
⚫ Other scientific satellites look away from the Earth, studying the sun, stars, planets and other aspects of the universe.
Application/Weather Satellites
⚫ Application satellites are used to test and develop ways to improve global weather forecasting.
⚫ These satellites are vital in predicting where and when tropical storms, hurricanes, floods, cyclones, tidal waves and forest fires may strike.
⚫ The Television Infrared Observation Satellite (TIROS), launched in 1960, was the first of a series of meteorological satellites to carry television cameras to photograph the Earth's cloud cover for research and forecasting.
⚫ Later satellites, like the series of Nimbus satellites first launched in 1964, had infrared cameras as well. These satellites improved upon storm and hurricane forecasting and played a major role in the study of ozone depletion.
Communications Satellites
⚫ Communications satellites were the first commercial satellites; aluminum-coated balloons were the earliest of them.
⚫ The first commercially launched satellite was Telstar 1, launched by AT&T in 1962. It transmitted photos and phone calls between America and Europe and was capable of 600 phone channels or one television channel.
⚫ Today, satellites like Intelsat provide up to 120,000 simultaneous two-way telephone circuits.
REMOTE SENSING SATELLITES
Landsat Satellite Programme
The National Aeronautics and Space Administration (NASA) of the USA planned the launching of a series of Earth Resources Technology Satellites (ERTS), and consequently ERTS-1 was launched in July 1972. ERTS-1 was later renamed LANDSAT-1, and a series of further Landsat satellites has been launched since. They are extensively used for agriculture, civil engineering, forestry, geography, geology, land use planning and oceanography.
SPOT Satellite Programme
France, Sweden and Belgium jointly developed the Système Probatoire d'Observation de la Terre (SPOT). The first satellite was launched in February 1986. SPOT data are extensively used for urban planning, urban growth assessment, transportation planning etc.
Indian Remote Sensing Satellites (IRS)
The Satellite for Earth Observation (SEO-1), now called Bhaskara-1 and launched in 1979, was the first Indian remote sensing satellite. IRS-1A, IRS-1B, IRS-1C, IRS-1D, IRS-P4, RESOURCESAT, CARTOSAT etc. are a few of the satellites launched by India.
SENSORS
Sensors are electronic instruments that receive EM radiation and generate an electric signal that corresponds to the energy variations of different earth surface features. The signal can be recorded and displayed as numerical data or an image. A scanning system employs detectors with a narrow field of view which sweep across the terrain to produce an image.
Sensors on board IRS
1) Linear Imaging and Self-Scanning sensor (LISS-I & LISS-II): on board the IRS-1A and 1B satellites. It had 4 bands operating in the visible and near-infrared region.
2) Linear Imaging and Self-Scanning sensor (LISS-III): on board the IRS-1C and 1D satellites. It has 3 bands in the visible and near-infrared region and 1 band in the shortwave infrared region.
3) Panchromatic Sensor (PAN): on board the IRS-1C and 1D satellites. It has one band.
4) Wide Field Sensor (WiFS): on board the IRS-1C and 1D satellites. It has two bands operating in the visible and near-infrared region.
5) Modular Opto-Electronic Scanner (MOS): on board the IRS-P3 satellite.
6) Ocean Colour Monitor (OCM): on board the IRS-P4 satellite. It has 8 spectral bands operating in the visible and near-IR region.
7) Multi-frequency Scanning Microwave Radiometer (MSMR): on board the IRS-P4 satellite. It is a passive microwave sensor.
Types of remote sensing
Passive: the source of energy is either the Sun or the Earth/atmosphere.
◦ Sun: wavelengths 0.4-5 µm
◦ Earth or its atmosphere: wavelengths 3 µm - 30 cm
Active: the source of energy is part of the remote sensor system.
◦ Radar: wavelengths mm - m
◦ Lidar: wavelengths UV, visible, and near infrared
Spatial Resolution
Spatial resolution and coverage:
 Instantaneous field of view (IFOV)
 Pixel: the smallest unit of an image
 Pixel size
Spatial coverage:
 Field of view (FOV), or
 Area of coverage, such as MODIS (a 2300 km swath, giving global coverage) or weather radar (NEXRAD: a circle with a 230 km radius)
(Comparison figures: 30 m spatial resolution vs 1 m spatial resolution.)
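Spatial resolution can be related to the sensor's IFOV and altitude: at nadir, the ground-projected pixel size is roughly altitude × IFOV (with the IFOV in radians, using the small-angle approximation). A minimal Python sketch; the IFOV and altitude below are illustrative values close to those of a Landsat-class sensor, not the specification of any particular instrument:

def ground_pixel_size_m(altitude_km, ifov_microrad):
    """Approximate nadir ground pixel size: altitude * IFOV (small-angle)."""
    return altitude_km * 1000.0 * ifov_microrad * 1e-6

# Illustrative: a 42.5-microradian IFOV at 705 km altitude gives ~30 m pixels.
print(round(ground_pixel_size_m(705, 42.5), 1))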
  • 129. Applications of Remote Sensing 1) Agriculture - Monitoring crop condition ,Identification of crops and their coverage estimation, Detection of moisture stress in crops, Desertification 2)Forestry - Improved forest type mapping ,Monitoring large scale deforestation, forest fires, Monitoring urban forestry ,Wild life habitat assessment 3)Land use and soils - Mapping land use land cover, Change detection, Soil Categorization 4)Urban Land Use - Transport studies ,Monitoring urban sprawl, Identification of unauthorized Structures, Navigation purposes. 5)Water resources Monitoring of surface water bodies ,Glacier inventory
• 130. 6) Watershed: delineation of watershed boundaries, silting of water-harvesting structures, major river valley projects.
7) Disasters: mapping flood-inundated areas, damage assessment.
8) Digital elevation models: contour maps, slope maps.
• 135. Evolution of satellite Remote Sensing in India • Following the successful demonstration flights of Bhaskara-1 and Bhaskara-2, experimental Earth observation satellites developed and built by ISRO (Indian Space Research Organisation) and launched in 1979 and 1981 respectively, India began the development of an indigenous IRS (Indian Remote Sensing Satellite) programme. • India realized quite early that sustaining its space programme in the long run would depend on indigenous technological capabilities (US export restrictions, in particular, made this clear).
• 136. • India, under its various earth observation missions and programmes, has launched a variety of satellites which have proved to be an indispensable tool for natural resource mapping, monitoring, management and planning, including environmental assessment at global, regional and local levels. • The success of these missions and developmental programmes has rested on a judicious scientific approach: selecting multiple space platforms, multiple resolutions, and synoptic viewing capabilities. • Keeping this in mind, besides building satellites, India also embarked on satellite launch vehicle development in the early 1970s.
• 137. • As a consequence, India entered the 21st century with two very capable launch systems, namely the PSLV (Polar Satellite Launch Vehicle) and the GSLV (Geosynchronous Satellite Launch Vehicle). • IRS is the integrated LEO (Low Earth Orbit) element of India's NNRMS (National Natural Resources Management System), with the objective of providing a long-term spaceborne operational capability to India for the observation and management of the country's natural resources (applications in agriculture, hydrology, geology, drought and flood monitoring, marine studies, snow studies, and land use).
• 138. • The intent of the programme is to create an environment of new perspectives for the Indian research community as a whole, to stimulate the development of new technologies and applications, and to utilize the Earth's resources in more meaningful ways. • Note: the INSAT system is India's GEO (Geosynchronous Earth Orbit) element, providing simultaneous domestic communications and earth observation functions.
• 139. Indian Remote Sensing Satellites

Satellite | Launch Date | Launch Vehicle | Type of Satellite
SARAL | 25.02.2013 | PSLV-C20 | Earth Observation Satellite
RISAT-1 | 26.04.2012 | PSLV-C19 | Earth Observation Satellite
Jugnu | 12.10.2011 | PSLV-C18 | Experimental / Small Satellite
SRMSat | 12.10.2011 | PSLV-C18 | Experimental / Small Satellite
Megha-Tropiques | 12.10.2011 | PSLV-C18 | Earth Observation Satellite
GSAT-12 | 15.07.2011 | PSLV-C17 | Geo-Stationary Satellite
GSAT-8 | 21.05.2011 | Ariane-5 VA-202 | Geo-Stationary Satellite
RESOURCESAT-2 | 20.04.2011 | PSLV-C16 | Earth Observation Satellite
YOUTHSAT | 20.04.2011 | PSLV-C16 | Experimental / Small Satellite
GSAT-5P | 25.12.2010 | GSLV-F06 | Geo-Stationary Satellite
STUDSAT | 12.07.2010 | PSLV-C15 | Experimental / Small Satellite
CARTOSAT-2B | 12.07.2010 | PSLV-C15 | Earth Observation Satellite
GSAT-4 | 15.04.2010 | GSLV-D3 | Geo-Stationary Satellite
Oceansat-2 | 23.09.2009 | PSLV-C14 | Earth Observation Satellite
ANUSAT | 20.04.2009 | PSLV-C12 | Experimental / Small Satellite
RISAT-2 | 20.04.2009 | PSLV-C12 | Earth Observation Satellite
Chandrayaan-1 | 22.10.2008 | PSLV-C11 | Space Mission
CARTOSAT-2A | 28.04.2008 | PSLV-C9 | Earth Observation Satellite
IMS-1 | 28.04.2008 | PSLV-C9 | Earth Observation Satellite
INSAT-4B | 12.03.2007 | Ariane-5ECA | Geo-Stationary Satellite
CARTOSAT-2 | 10.01.2007 | PSLV-C7 | Earth Observation Satellite
SRE-1 | 10.01.2007 | PSLV-C7 | Experimental / Small Satellite
INSAT-4CR | 02.09.2007 | GSLV-F04 | Geo-Stationary Satellite
• 140. INSAT-4C | 10.07.2006 | GSLV-F02 | Geo-Stationary Satellite
INSAT-4A | 22.12.2005 | Ariane-5GS | Geo-Stationary Satellite
HAMSAT | 05.05.2005 | PSLV-C6 | Experimental / Small Satellite
CARTOSAT-1 | 05.05.2005 | PSLV-C6 | Earth Observation Satellite
EDUSAT (GSAT-3) | 20.09.2004 | GSLV-F01 | Geo-Stationary Satellite
Resourcesat-1 (IRS-P6) | 17.10.2003 | PSLV-C5 | Earth Observation Satellite
INSAT-3A | 10.04.2003 | Ariane-5G | Geo-Stationary Satellite
INSAT-3E | 28.09.2003 | Ariane-5G | Geo-Stationary Satellite
• 141. INSAT-1B | 30.08.1983 | Shuttle [PAM-D] | Geo-Stationary Satellite
Rohini (RS-D2) | 17.04.1983 | SLV-3 | Earth Observation Satellite
INSAT-1A | 10.04.1982 | Delta 3910 PAM-D | Geo-Stationary Satellite
Bhaskara-II | 20.11.1981 | C-1 Intercosmos | Earth Observation Satellite
Ariane Passenger Payload Experiment (APPLE) | 19.06.1981 | Ariane-1 (V-3) | Geo-Stationary Satellite
Rohini (RS-D1) | 31.05.1981 | SLV-3 | Earth Observation Satellite
Rohini (RS-1) | 18.07.1980 | SLV-3 | Experimental / Small Satellite
Rohini Technology Payload (RTP) | 10.08.1979 | SLV-3 | Experimental / Small Satellite
Bhaskara-I | 07.06.1979 | C-1 Intercosmos | Earth Observation Satellite
Aryabhata | 19.04.1975 | C-1 Intercosmos | Experimental / Small Satellite
• 142. Basic Principle of Remote Sensing: Objects and surfaces can be recognized and distinguished by the radiant energy they emit or reflect. This principle underpins remote sensing, which detects and records that radiant energy for further study. Different objects and surfaces, like water, soil, or plants, return different quantities of energy in different electromagnetic bands. Which factors actually affect this? The main ones are: properties of the object (chemical and physical composition, the kind of material, and surface roughness) and properties of the radiation (radiant intensity, incidence angle, and wavelength). A band-ratio sketch illustrating this is given below.
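As an illustration of this principle, the sketch below applies the standard Normalized Difference Vegetation Index (NDVI), a red/near-infrared band ratio that the slides do not cover; the reflectance values are illustrative order-of-magnitude numbers, not measured data:

# Illustrative reflectance values only (order of magnitude, not measured data)
samples = {
    "vegetation": {"red": 0.05, "nir": 0.50},  # chlorophyll absorbs red; leaf structure reflects NIR
    "water":      {"red": 0.04, "nir": 0.01},  # water absorbs strongly in the NIR
    "bare soil":  {"red": 0.25, "nir": 0.30},
}

def ndvi(red, nir):
    # Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)
    return (nir - red) / (nir + red)

for name, bands in samples.items():
    print(f"{name:10s} NDVI = {ndvi(bands['red'], bands['nir']):+.2f}")

Vegetation scores strongly positive, water negative, and soil near zero: exactly the band-dependent behaviour described above.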
  • 143. Diverse disciplines come together in remote sensing, including optical science, photography, spectroscopy, electronics, computer science, telecommunications, satellite launch technology, and more.
• 144. Elements Of Remote Sensing: Incident radiation interacts with the objects of study and is captured by the sensor. A good example of this is an imaging system, which typically involves the elements of remote sensing listed below. Yet it is important to remember that remote sensing also includes non-imaging sensor types and the detection of emitted energy. The following seven basic components form the backbone of remote sensing: Source of energy/illumination: a source of energy must be available either to illuminate the object of interest or to supply electromagnetic radiation to it. Radiation/energy and the atmosphere: throughout its path from its origin to its destination, the radiation will interact with atmospheric particles; as the energy travels from the object to the sensor, a second interaction takes place.
• 145. Elements Of Remote Sensing (continued): Object of study: when the energy finally reaches its target, their interaction is determined by the characteristics of both the radiation and the object. Radiation-recording sensor: a sensor that is physically separate from the object of study must pick up the electromagnetic radiation the target emits or reflects. Data processing facility: the sensor's readings must be sent (usually electronically) to a data receiving and processing facility, where the measured energy is translated into a usable image. Analysis and interpretation: following data processing, the image is analyzed and interpreted visually and/or digitally to acquire information about the object of study. Practical use: lastly, the process involves putting the information learned from the images to good use to gain a deeper understanding of the target, uncover previously unknown facts, or aid in problem-solving.
• 153. Atmospheric Interactions with Electromagnetic Radiation: All electromagnetic radiation detected by a remote sensor has to pass through the atmosphere twice, before and after its interaction with the earth's surface. This passage alters the speed, frequency, intensity, spectral distribution, and direction of the radiation; as a result, atmospheric scattering and absorption occur (Curran, 1988). These effects are most severe at visible and infrared wavelengths, the range most crucial to remote sensing.
• 154. During the transmission of energy through the atmosphere, light interacts with gases and particulate matter in a process called atmospheric scattering. The two major classes of scattering are selective and non-selective. Rayleigh, Mie and Raman scattering are of the selective type. Non-selective scattering is independent of wavelength; it is produced by particles whose radii exceed 10 µm, such as the water droplets and ice fragments present in clouds, and it reduces the contrast of the image. While passing through the atmosphere, electromagnetic radiation is thus scattered and absorbed by gases and particulates. A sketch of the wavelength dependence of Rayleigh scattering follows.
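The slides note only that Rayleigh scattering is selective; the standard textbook relation is that its intensity varies as λ⁻⁴, so shorter wavelengths scatter far more strongly. A minimal Python sketch of this relation:

def relative_rayleigh(wavelength_um, reference_um=0.55):
    # Rayleigh scattering intensity relative to a reference wavelength,
    # using the standard lambda^-4 proportionality.
    return (reference_um / wavelength_um) ** 4

for wl, colour in [(0.45, "blue"), (0.55, "green"), (0.65, "red")]:
    print(f"{colour:5s} ({wl} um): {relative_rayleigh(wl):.2f} x the green-light scattering")

Blue light at 0.45 µm scatters roughly 2.2 times as strongly as green, which is why Rayleigh scatter adds a bluish haze to imagery.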
• 155. Besides the major gaseous components, molecular nitrogen and oxygen, other constituents like water vapour, methane, hydrogen, helium and nitrogen compounds play an important role in modifying the incident and reflected radiation. This causes a reduction in image contrast and introduces radiometric errors. Regions of the electromagnetic spectrum in which the atmosphere is transparent are called atmospheric windows; a small lookup sketch is given below.
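As a rough illustration of atmospheric windows, the sketch below checks whether a given wavelength falls inside one; the window limits are approximate textbook values assumed for this example, not taken from the slides:

# Approximate textbook window limits in micrometres (assumed values,
# not taken from the slides).
ATMOSPHERIC_WINDOWS_UM = [
    (0.3, 1.3),   # visible / near-infrared
    (1.5, 1.8),   # shortwave infrared
    (2.0, 2.6),   # shortwave infrared
    (3.0, 3.6),   # mid-infrared
    (4.2, 5.0),   # mid-infrared
    (7.0, 15.0),  # thermal infrared
]

def in_window(wavelength_um):
    return any(lo <= wavelength_um <= hi for lo, hi in ATMOSPHERIC_WINDOWS_UM)

print(in_window(0.55))  # True: visible light passes through the atmosphere
print(in_window(1.9))   # False: strong water-vapour absorption near 1.9 um

Sensor bands are deliberately placed inside such windows so that the signal reaching the detector is dominated by the surface, not the atmosphere.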
• 157. Energy interactions with Earth's surface materials: When electromagnetic energy is incident on any feature of the earth's surface, such as a water body, fractions of that energy are reflected, absorbed, and transmitted, as shown in Fig. 2.13. Applying the principle of conservation of energy, the relationship can be expressed as:
• 158. E_I(λ) = E_R(λ) + E_A(λ) + E_T(λ), where E_I(λ) is the incident energy and E_R(λ), E_A(λ) and E_T(λ) are the reflected, absorbed and transmitted components at wavelength λ.
• 159. All energy components are functions of wavelength λ. In remote sensing, the amount of reflected energy E_R(λ) is more important than the absorbed and transmitted energies, so it is convenient to rearrange the terms as E_R(λ) = E_I(λ) − [E_A(λ) + E_T(λ)]. This is called the energy balance equation.
• 160. From this equation, two important points can be drawn. Firstly, dividing each term by the incident energy gives the dimensionless reflectance, absorptance and transmittance, denoted ρ(λ), α(λ) and τ(λ), which sum to 1. Simply put, the measure of how much electromagnetic radiation is reflected off a surface is called its reflectance. Reflectance lies between 0 and 1: a value of 1.0 means that 100% of the incident radiation is reflected off the surface, and a value of 0 means that none is reflected. The reflectance characteristics are quantified by the spectral reflectance ρ(λ), expressed as the ratio ρ(λ) = E_R(λ) / E_I(λ), illustrated in the sketch below.
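A minimal Python sketch of the balance equation and the reflectance ratio, using illustrative numbers only:

def spectral_fractions(e_incident, e_reflected, e_absorbed, e_transmitted):
    # Returns (rho, alpha, tau): reflectance, absorptance, transmittance.
    # Conservation of energy requires E_I = E_R + E_A + E_T.
    assert abs(e_incident - (e_reflected + e_absorbed + e_transmitted)) < 1e-9
    return (e_reflected / e_incident,
            e_absorbed / e_incident,
            e_transmitted / e_incident)

# Illustrative numbers only: 100 units in, 50 reflected, 10 absorbed, 40 transmitted
rho, alpha, tau = spectral_fractions(100.0, 50.0, 10.0, 40.0)
print(f"rho = {rho:.2f}, alpha = {alpha:.2f}, tau = {tau:.2f}, sum = {rho + alpha + tau:.2f}")

The printed sum of 1.00 is exactly the constraint ρ(λ) + α(λ) + τ(λ) = 1 derived above.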