Spatial resolution refers to the ability to distinguish between two close objects or fine detail in an image. It depends on properties of the imaging system, not just pixel count. Higher spatial resolution means finer details can be distinguished. Pixel count alone does not determine spatial resolution, as color images require interpolation between sensor pixels. Spatial resolution is measured differently for various media like film, digital cameras, microscopes, and more. It affects the ability to distinguish fine detail like gaps in a fence as distance increases.
What is Spatial Resolution?
1. What is Spatial Resolution ?
A presentation for better understanding!
S.A.Quadri
CEDEC , USM , Malaysia
2. Effect of Spatial resolution on visualization
(Satellite image : Reference http://visibleearth.nasa.gov/view_rec.php?id=1427)
3.
4. Image resolution
It is an umbrella term that describes the detail an image holds.
The term applies to raster digital images, film images, and other types of images.
Higher resolution means more image details.
Image resolution can be measured in various ways.
Resolution quantifies how close lines can be to each other and still be visibly resolved.
Resolution units can be tied to physical sizes (e.g. lines per mm, lines per inch), to the
overall size of a picture (lines per picture height, also known simply as lines, TV lines, or
TVL), or to angular subtense.
Line pairs are often used instead of lines.
A line pair comprises a dark line and an adjacent light line.
A line is either a dark line or a light line.
A resolution of 10 lines per millimeter means 5 dark lines alternating with 5 light lines, or 5
line pairs per millimeter (5 LP/mm).
Photographic lens and film resolution are most often quoted in line pairs per millimeter.
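As a quick sanity check on the lines-vs-line-pairs convention above, a minimal Python sketch (the function name is illustrative):

```python
def lines_to_line_pairs(lines_per_mm: float) -> float:
    """Convert a resolution in lines/mm to line pairs/mm.

    A line pair is one dark line plus one adjacent light line,
    so the line-pair count is half the line count.
    """
    return lines_per_mm / 2.0

# The slide's example: 10 lines per millimetre
print(lines_to_line_pairs(10))  # 5.0, i.e. 5 LP/mm
```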
5. Resolution of digital images
The resolution of digital images can be described in many different ways.
The term resolution is often used for a pixel count in digital imaging, even though American, Japanese, &
international standards specify that it should not be so used, at least in the digital camera field.
•An image of N pixels high by M pixels wide can have any resolution less than N lines per picture height, or N TV
lines. But when the pixel counts are referred to as resolution, the convention is to describe the pixel resolution with
the set of two positive integer numbers, where the first number is the number of pixel columns (width) and the
second is the number of pixel rows (height), for example as 640 by 480.
•Another popular convention is to cite resolution as the total number of pixels in the image, typically given as
number of megapixels, which can be calculated by multiplying pixel columns by pixel rows and dividing by one
million.
•Other conventions include describing pixels per length unit or pixels per area unit, such as pixels per inch or per
square inch.
•According to the same standards, the number of effective pixels that an image sensor or digital camera has is the
count of elementary pixel sensors that contribute to the final image, as opposed to the number of total pixels,
which includes unused or light-shielded pixels around the edges.
None of these pixel resolutions are true resolutions, but they are widely referred to as such;
they serve as upper bounds on image resolution.
6. Effect of pixel resolutions
Below is an illustration of how the same image might appear at different pixel
resolutions, if the pixels were poorly rendered as sharp squares (normally, a
smooth image reconstruction from pixels would be preferred, but for illustration of
pixels, the sharp squares make the point better).
7. Further Explanation
An image that is 2048 pixels in width and 1536 pixels in height has a total of 2048×1536 = 3,145,728
pixels.
One could refer to it as 2048 by 1536 or a 3.1-megapixel image.
Unfortunately, the count of pixels is not a real measure of the resolution of digital camera
images, because :
Color image sensors are typically set up to alternate color filter types over the light sensitive individual
pixel sensors.
Digital images ultimately require a red, green, and blue value for each pixel to be displayed or printed,
but one individual pixel in the image sensor will only supply one of those three pieces of information.
The image has to be interpolated or demosaiced to produce all three colors for each output pixel.
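The megapixel arithmetic above can be reproduced directly (plain Python; the function name is illustrative):

```python
def megapixels(width: int, height: int) -> float:
    """Total pixel count in megapixels: pixel columns x pixel rows / 1,000,000."""
    return width * height / 1_000_000

# The slide's example image: 2048 pixels wide by 1536 pixels high
print(2048 * 1536)                        # 3145728 pixels in total
print(round(megapixels(2048, 1536), 1))   # 3.1, i.e. a "3.1-megapixel" image
```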
8. Spatial resolution
The measure of how closely lines can be resolved in an image is called spatial resolution, and it depends on properties of the system
creating the image, not just the pixel resolution in pixels per inch (ppi).
For practical purposes the clarity of the image is decided by its spatial resolution, not the number of pixels in an image.
In effect, spatial resolution refers to the number of independent pixel values per unit length.
•The spatial resolution of computer monitors is generally 72 to 100 lines per inch, corresponding to pixel resolutions of 72 to 100 ppi.
•With scanners, optical resolution is used to distinguish spatial resolution from the number of pixels per inch.
•In geographic information systems (GISs), spatial resolution is measured by the ground sample distance (GSD) of an image, the pixel
spacing on the Earth's surface.
•In astronomy one often measures spatial resolution in data points per arc second subtended at the point of observation, since the physical
distance between objects in the image depends on their distance away & this varies widely with the object of interest.
•In electron microscopy, line or fringe resolution refers to the minimum separation detectable between adjacent parallel lines (e.g.
between planes of atoms), while point resolution instead refers to the minimum separation between adjacent points that can be both
detected & interpreted e.g. as adjacent columns of atoms, for instance.
•In Stereoscopic 3D images, spatial resolution could be defined as the spatial information recorded or captured by two viewpoints of a
stereo camera (left & right camera).
It could be argued that such "spatial resolution" adds information to an image, so that the overall resolution of a given
photographic image or video frame would not depend solely on pixel count or dots per inch when being classified and interpreted.
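For the GIS case, a commonly used approximation relates the ground sample distance to the sensor's pixel pitch, flying height and focal length. The sketch below uses illustrative parameter values, not figures from any particular satellite:

```python
def ground_sample_distance(pixel_pitch_m: float,
                           altitude_m: float,
                           focal_length_m: float) -> float:
    """Approximate GSD for a nadir-looking frame camera:
    ground distance covered by one pixel = pixel pitch * height / focal length.
    All inputs in metres; result in metres per pixel."""
    return pixel_pitch_m * altitude_m / focal_length_m

# Illustrative values: 6.5 um pixel pitch, 500 km orbit, 5.0 m focal length
gsd = ground_sample_distance(6.5e-6, 500_000, 5.0)
print(round(gsd, 2))  # 0.65 metres per pixel
```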
9. Spatial resolution and Pixel count
Note the difference between the two!
(Comparison images: spatial resolution vs. pixel count.)
10. Spectral resolution
Color images distinguish light of different spectra.
Multi-spectral images resolve even finer differences of spectrum or wavelength than is needed to reproduce
color. That is, they have higher spectral resolution (i.e. narrower wavelength bands).
Temporal resolution
Movie cameras and high-speed cameras can resolve events at different points in time.
The time resolution used for movies is usually 24 to 30 frames per second (frames/s),
while high-speed cameras may resolve 100 to 1000 frames/s, or even more.
Radiometric resolution
Radiometric resolution determines how finely a system can represent or distinguish differences of intensity,
and is usually expressed as a number of levels or a number of bits, for example, 8 bits or 256 levels that is
typical of computer image files.
The higher the radiometric resolution, the better subtle differences of intensity or reflectivity can be
represented, at least in theory.
In practice, the effective radiometric resolution is typically limited by the noise level, rather than by the
number of bits of representation.
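The levels-vs-bits relationship above can be sketched in a couple of lines (levels = 2^bits):

```python
def radiometric_levels(bits: int) -> int:
    """Number of distinguishable intensity levels for a given bit depth."""
    return 2 ** bits

for bits in (1, 6, 8, 11, 16):
    print(bits, "bits ->", radiometric_levels(bits), "levels")
# 8 bits -> 256 levels, as is typical of computer image files
```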
11. Resolution in various media
This is a list of resolutions for various media.
Analog and early digital
352×240 : Video CD
300×480 : Umatic, Betamax, VHS, Video8
350×480 : Super Betamax, Betacam
420×480 : LaserDisc, Super VHS, Hi8
640×480 : Analog broadcast (NTSC)
670×480 : Enhanced Definition Betamax
768×576 : Analog broadcast (PAL, SECAM)
Digital
720×480 : D-VHS, DVD, miniDV, Digital8, Digital Betacam
720×480 : Widescreen DVD (anamorphic)
1280×720 : D-VHS, HD DVD, Blu-ray, HDV (miniDV)
1440×1080 : HDV (miniDV)
1920×1080 : HDV (miniDV), AVCHD, HD DVD, Blu-ray, HDCAM SR
2048×1080 : 2K Digital Cinema
4096×2160 : 4K Digital Cinema
7680×4320 : UHDTV
Film
35 mm film is scanned for release on DVD at 1080 or 2000 lines as of 2005.
However, some photography sources give 5380×3620 as the resolution of 35 mm film,
which corresponds to about 19.5 megapixels.
IMAX, including IMAX HD and OMNIMAX: approximately 10,000×7,000 (7,000 lines) resolution.
That is about 70 megapixels, which may be considered the highest resolution among these media.
12. Spatial Resolution and Pixel Size
The terms image resolution and pixel size are often used interchangeably.
In reality, they are not equivalent. An image sampled at a small pixel size does not necessarily have a high resolution.
The following three images illustrate this point. The first image is a SPOT image of 10 m pixel size.
It was derived by merging a SPOT panchromatic image of 10 m resolution with a SPOT multispectral image of 20 m
resolution.
The effective resolution is thus determined by the resolution of the panchromatic image, which is 10 m.
This image is further processed to degrade the resolution while maintaining the same pixel size.
The next two images are blurred versions of the first, with coarser effective resolution but still digitized at the same
pixel size of 10 m.
Even though they have the same pixel size as the first image, they do not have the same resolution.
13. RESOLUTION AND SHARPNESS
To determine resolution, a raster is normally used, employing increasingly fine bars and gaps. A common example in
real images would be a picket fence displayed in perspective.
In the image of the fence, shown in Fig. 1, it is evident that the gaps between the boards become increasingly difficult
to discriminate as the distance becomes greater.
This effect is the basic problem of every optical image.
In the foreground of the image, where the boards and gaps have not yet been squeezed together by the perspective, a
large difference in brightness is recognized.
The more the boards and gaps are squeezed together in the distance, the less difference is seen in the brightness.
To better understand this effect, the brightness values are shown along the yellow arrow in an x / y diagram (Fig. 2).
The brightness difference seen in the y-axis is called contrast.
The curve itself functions like a harmonic oscillation; because the brightness does not change over time but spatially
from left to right, the x-axis is called spatial frequency.
14. It can be clearly seen in Fig. 1 that the finer the reproduced structure, the more the contrast
will be “slurred” at that point in the image.
The limit of the resolution has been reached when one can no longer clearly differentiate
between the structures.
This means the resolution limit (red circle indicated in Fig. 2) lies at the spatial frequency
where there is just enough contrast left to clearly differentiate between board and gap.
15. Resolution = Sharpness?
Are resolution and sharpness the same? By looking at the images shown below, one can quickly determine which image
is sharper.
Although the image on the left comprises twice as many pixels, the image on the right, whose contrast at coarse details
is increased with a filter, looks at first glance to be distinctly sharper.
The resolution limit describes how much information makes up each image, but not how a person evaluates this
information.
The human eye, in fact, is able to resolve extremely fine details.
This ability is also valid for objects at a greater distance.
The decisive physiological point, however, is that fine details do not contribute to the subjective perception of
sharpness.
Therefore, it’s important to clearly separate the two terms, resolution and sharpness.
16. MTF
Modulation transfer function describes the relationship between resolution and sharpness, and is the basis for a
scientific confirmation of the phenomenon described earlier.
The modulation component in MTF means approximately the same as contrast.
If we evaluate the contrast (modulation) not only where the resolution reaches its limit, but over as many spatial
frequencies as possible and connect these points with a curve, we arrive at the so-called MTF.
As shown in the figure, the x-axis shows the already-established spatial frequency expressed in lp/mm, while the y-axis
shows the modulation instead of the brightness.
A modulation of 1 (or 100%) is the ratio of the brightness of a completely white image to the brightness of a
completely black image.
The higher the spatial frequency, in other words the finer the structures in the image, the lower the transferred
modulation. (lp = line pairs)
Conclusions:
•Sharpness does not depend only on resolution.
•The modulation at lower spatial frequencies is essential.
•Contrast in coarse details is significantly more important for the impression of sharpness than contrast at the
resolution limit.
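The modulation (contrast) that the MTF plots can be written as (Imax - Imin) / (Imax + Imin), the Michelson contrast; a minimal sketch:

```python
def modulation(i_max: float, i_min: float) -> float:
    """Michelson modulation (contrast) of a sinusoidal brightness pattern."""
    return (i_max - i_min) / (i_max + i_min)

print(modulation(1.0, 0.0))  # 1.0: completely white vs completely black
print(modulation(0.6, 0.4))  # ~0.2: a fine structure whose contrast is "slurred"
```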
17. Resolution of the human eye
The fovea of the human eye (the part of the retina that is responsible for sharp central vision) includes
about 140 000 sensor-cells per square millimeter.
This means that if two objects are projected with a separation distance of more than 4 µm on the fovea,
a human with normal visual acuity (20/20) can resolve them.
On the object side, this corresponds to about 0.2 mm at a distance of 1 m (roughly 1 minute of arc).
In practice of course, this depends on whether the viewer is concentrating only on the center of the
viewing field, whether the object is moving very slowly or not at all, and whether the object has good
contrast to the background. Allowing for some amount of tolerance, this would be around 0.3 mm at 1
m distance (= 1.03 minutes of arc ). In a certain range, one can assume a linear relation between
distance and the detail size
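The small-angle relation behind these figures is easy to verify; the sketch below reproduces the 0.3 mm at 1 m tolerance figure (1.03 minutes of arc):

```python
import math

def detail_size_mm(distance_m: float, angle_arcmin: float) -> float:
    """Smallest resolvable detail (in mm) at a given distance,
    using the small-angle relation: size = distance * angle (in radians)."""
    angle_rad = math.radians(angle_arcmin / 60.0)
    return distance_m * angle_rad * 1000.0  # metres -> millimetres

print(round(detail_size_mm(1.0, 1.03), 2))  # 0.3 mm at 1 m, the tolerance figure above
print(round(detail_size_mm(10.0, 1.03), 1))  # ~3.0 mm at the 10 m test distance
```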
18. This hypothesis can be easily proved !!!
Pin the test pattern displayed in Figure below on a well-lit wall and walk away 10 m.
One should be able to clearly differentiate between the lines and gaps in Figure.
Of course, this requires an ideal visual acuity of 20/20.
Nevertheless, if you can’t resolve the pattern in Figure,
you might consider paying a visit to an ophthalmologist !
19. How we interpret optical images ?
Let us see significance of spatial resolution and various other related terms:
Four main types of information contained in an optical image are often utilized for
image interpretation:
•Radiometric Information (i.e. brightness, intensity, tone),
•Spectral Information (i.e. color, hue),
•Textural Information,
•Geometric and Contextual Information.
They are illustrated in the following examples,
20. There are different types of images :
•Panchromatic Images
•Multispectral Images
•Color Composite Images
•True Color Composite images
•False Color Composite images
•Natural Color Composite
21. Panchromatic image
A panchromatic image consists of only one band.
It is usually displayed as a grey scale image.
A panchromatic image may be interpreted similarly to a black-and-white aerial photograph of the area.
The Radiometric Information is the main information type utilized in the interpretation.
A panchromatic image extracted from a SPOT panchromatic scene at a ground resolution of 10 m.
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and
http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
22. Multispectral Images
A multispectral image consists of several bands of data.
For visual display, each band of the image may be displayed one band at a time as a grey scale image, or in combination of 3 bands at a time
as a color composite image.
Interpretation of a multispectral color composite image will require the knowledge of the spectral reflectance signature of the targets in the
scene.
In this case, the spectral information content of the image is utilized in the interpretation.
The following 3 images show the 3 bands of a multispectral image extracted from a SPOT multispectral scene at a ground resolution of 20 m.
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
23. Color Composite Images
In displaying a color composite image, three primary colors (red, green and blue) are used.
When these three colors are combined in various proportions, they produce different colors in the visible
spectrum.
Associating each spectral band (not necessarily a visible band) to a separate primary color results in a
color composite image.
24. True Color Composite
If a multispectral image consists of the three visual primary color bands (red, green, blue), the three bands may be
combined to produce a "true color" image.
The bands 3 (red band), 2 (green band) and 1 (blue band) of a LANDSAT TM image or an IKONOS multispectral
image can be assigned respectively to the R, G, and B colors for display.
In this way, the colors of the resulting color composite image resemble closely what would be observed by the human
eyes.
A 1-m resolution true-color IKONOS image
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
25. False Color Composite
The display color assignment for any band of a multispectral image can be done in an entirely arbitrary manner.
In this case, the color of a target in the displayed image does not have any resemblance to its actual colour.
The resulting product is known as a false colour composite image.
There are many possible schemes of producing false colour composite images.
Some schemes are suitable for detecting certain objects in the image.
False colour composite multispectral SPOT image
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
26. Natural Colour Composite
For optical images lacking one or more of the three visual primary colour bands (i.e. red, green and blue), the spectral
bands (some of which may not be in the visible region) may be combined in such a way that the appearance of the
displayed image resembles a visible colour photograph, i.e. vegetation in green, water in blue, soil in brown or grey, etc.
Some people refer to this composite as a "true colour" composite. However, this term is misleading since in many
instances the colors are only simulated to look similar to the "true" colors of the targets. The term "natural colour" is
preferred.
Natural colour composite multispectral SPOT image
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
27. Vegetation Indices
Different bands of a multispectral image may be combined to accentuate the vegetated areas.
One such combination is the ratio of the near-infrared band to the red band. This ratio is known as the
Ratio Vegetation Index (RVI)
RVI = NIR/Red
Normalized Difference Vegetation Index (NDVI)
Since vegetation has high NIR reflectance but low red reflectance, vegetated areas will have higher RVI
values compared to non-vegetated areas. Another commonly used vegetation index is the Normalized
Difference Vegetation Index (NDVI) computed by
NDVI = (NIR - Red)/(NIR + Red)
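Both indices above can be computed per pixel; a minimal NumPy sketch with illustrative reflectance values (not measured data):

```python
import numpy as np

def rvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Ratio Vegetation Index: NIR / Red."""
    return nir / red

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

# Illustrative reflectances: a vegetated pixel (high NIR, low red) and bare soil
nir = np.array([0.50, 0.30])
red = np.array([0.08, 0.25])
print(ndvi(nir, red))  # vegetation scores high (~0.72), soil near zero (~0.09)
```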
28. Textural Information
Texture is an important aid in visual image interpretation, especially for high spatial resolution imagery.
It is also possible to characterize the textural features numerically, and algorithms for computer-aided
automatic discrimination of different textures in an image are available.
IKONOS 1-m resolution pan-sharpened color image of an oil palm plantation.
Even though the general colour is green throughout, three distinct, land cover types can be identified from the
image texture.
(Reference: http://visibleearth.nasa.gov/view_rec.php?id=1427 and http://earthobservatory.nasa.gov/Contact/index_ve.php?do=s)
29. Remote Sensing Satellites
Optical remote sensing makes use of visible, near-infrared and short-wave infrared sensors to form images of the earth's surface
by detecting the solar radiation reflected from targets on the ground.
Different materials reflect and absorb differently at different wavelengths.
Thus, the targets can be differentiated by their spectral reflectance signatures in the remotely sensed images.
Optical remote sensing systems are classified into the following types, depending on the number of spectral bands used in the imaging
process.
Several remote sensing satellites are currently available, providing imagery suitable for various types of applications.
Each of these satellite-sensor platform is characterized by the
•Wavelength bands employed in image acquisition,
•Spatial resolution of the sensor,
•The coverage area and the temporal coverage, i.e. how frequently a given location on the earth's surface can be imaged by the
imaging system.
30. In terms of the spatial resolution, the satellite imaging systems can be classified into:
•Low resolution systems (approx. 1 km or more)
•Medium resolution systems (approx. 100 m to 1 km)
•High resolution systems (approx. 5 m to 100 m)
•Very high resolution systems (approx. 5 m or less)
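The spatial-resolution classes above can be expressed as a small lookup function; the exact boundary handling is an assumption, since the slide's ranges are approximate:

```python
def resolution_class(gsd_m: float) -> str:
    """Map a system's ground resolution (in metres) to the classes above.

    Thresholds follow the slide's approximate ranges; boundary
    assignment (>= vs >) is an illustrative choice.
    """
    if gsd_m >= 1000:
        return "low"
    if gsd_m >= 100:
        return "medium"
    if gsd_m >= 5:
        return "high"
    return "very high"

for gsd in (1500, 250, 10, 1):
    print(gsd, "m ->", resolution_class(gsd))
```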
In terms of the spectral regions used in data acquisition, the satellite imaging systems can be classified into:
•Optical imaging systems (include visible, near infrared, and shortwave infrared systems)
•Thermal imaging systems
•Synthetic aperture radar (SAR) imaging systems
Optical/thermal imaging systems can be classified according to the number of spectral bands used:
•Monospectral or panchromatic (single wavelength band, "black-and-white", grey-scale image) systems
•Multispectral (several spectral bands) systems
•Superspectral (tens of spectral bands) systems
•Hyperspectral (hundreds of spectral bands) systems
Synthetic aperture radar imaging systems can be classified according to the combination of frequency bands &
polarization modes used in data acquisition, e.g.:
•Single frequency (L-band, or C-band, or X-band)
•Multiple frequency (Combination of two or more frequency bands)
•Single polarization (VV, or HH, or HV)
•Multiple polarization (Combination of two or more polarization modes)