Geospatial Information Systems and Science Forum
Intermediate Workshop on
Grid-based Map Analysis Techniques and GIS Modeling
8:30 to 10:30 am – November 19, 2004
Joseph K. Berry
Keck Scholar in Geosciences, Geography Department, University of Denver
Principal, Berry & Associates // Spatial Information Systems
Email email@example.com — Web www.innovativegis.com/basis/
Situation: Most desktop mapping and GIS applications have focused on
mapping and spatial data management responding to inventory assessments of
"Where Is What" involving digital maps and linked databases (computer
mapping and geo-query). GIS modeling provides new analytical tools and
processing structures for incorporating spatial relationships that address "Why
and So What" in both research and decision-making contexts (map analysis
and modeling). The extension of the geosciences from descriptive to
prescriptive mapping involves new spatial reasoning concepts and skills that are not reflected in
our paper map legacy and manual processing procedures.
Description: This intermediate level workshop discusses and demonstrates several techniques for
spatial analysis and data mining using numerous examples from natural resources management,
geo-business and precision agriculture. The discussion focuses on concepts, procedures and
practical considerations in successfully applying grid-based map analysis in GIS modeling. The
material presented encapsulates numerous “Beyond Mapping” columns by the instructor published
in GeoWorld magazine and compiled into the online book Map Analysis
(www.innovativegis.com/basis/, select Map Analysis book).
Who Should Attend: Faculty and students who are interested in or currently involved in the
development of systems that analyze spatial data should attend. Prior GIS exposure and
familiarity with basic statistical concepts are recommended.
About the Instructor: Dr. Joseph K. Berry is the Principal of Berry and
Associates // Spatial Information Systems, consultants and software developers
in Geographic Information Systems (GIS) technology. He also serves as the
Keck Scholar in Geosciences at the University of Denver and is a Special
Faculty member at Colorado State University. Prior to these assignments he
was an Associate Professor and the Associate Dean at Yale University’s School of Forestry and
Environmental Science. He is the author of the “Beyond Mapping” column for GeoWorld and has
written over two hundred papers on the analytic capabilities of GIS. He is the author of the
popular books Beyond Mapping and Spatial Reasoning and is the developer of the MapCalc
Learner materials for instruction of grid-based map analysis. Dr. Berry holds a B.S. degree in
Forestry, a M.S. degree in business management, and a Ph.D. emphasizing remote sensing and
land use planning.
General Notes for Intermediate Workshop on
Grid-based Map Analysis Techniques and GIS Modeling
Joseph K. Berry, email firstname.lastname@example.org, website www.innovativegis.com/basis
MapCalc Applications at www.innovativegis.com/basis/Senarios/default.html
Map Analysis book at www.innovativegis.com/basis/MapAnalysis/default.html
Cartography– manual map drafting (paper map legacy for thousands of years)
Computer Mapping– automates the cartographic process (70s)
Spatial Database Management– links computer mapping techniques with traditional database management (80s)
GIS Modeling– representation of relationships within and among mapped data (90s)…
Surface Modeling– maps the spatial distribution of a set of point sampled data,
Spatial Data Mining– characterizes the “numerical” relationships among mapped data and
develops predictive models,
Spatial Analysis– derives new information based on “contextual” relationships among
mapped data, and
GIS Modeling– logical processing of spatial information to characterize a system or solve a problem.
(See Map Analysis book, “Topic 4” for more information)
Raster refers to image display (map values represent the color assigned to each dot; e.g., scanned
topographic maps–DRGs or aerial photos–DOQs) while Grid refers to map analysis (map values
have all of the rights, privileges and responsibilities of numbers in map-ematics).
Grid Data Structure– the Analysis Frame provides consistent “parceling” needed for map analysis
and extends points, lines and areas to Map Surfaces.
(See MapCalc Applications, “Short Video Demos” for more information)
Shading Manager options include # of Ranges, Calculation Method (e.g., Equal Ranges with
same range for each interval and Equal Count with same number of cells for each interval) and
Color Palette/Ramp selection.
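The two calculation methods can be sketched with NumPy (a hypothetical illustration of the concepts, not the Shading Manager's actual code):

```python
import numpy as np

def equal_ranges(values, n_ranges):
    """Equal Ranges: every interval spans the same range of map values."""
    lo, hi = values.min(), values.max()
    edges = np.linspace(lo, hi, n_ranges + 1)
    # digitize against the interior break points, yielding interval 1..n
    return np.clip(np.digitize(values, edges[1:-1], right=True) + 1, 1, n_ranges)

def equal_count(values, n_ranges):
    """Equal Count: every interval holds roughly the same number of cells."""
    edges = np.quantile(values, np.linspace(0, 1, n_ranges + 1))
    return np.clip(np.digitize(values, edges[1:-1], right=True) + 1, 1, n_ranges)
```

Note how a single outlier (e.g., one very large cell value) stretches the Equal Ranges intervals but barely moves the Equal Count breaks.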
Grid Display Types are Lattice that forms a smooth “wireframe” by connecting cell centroids
with lines whose lengths are a function of elevation differences and Grid that forms extruded grids
whose heights are a function of elevation differences.
(See MapCalc Applications, “Display Types” for more information)
Grid Data Types are characterized by their Numeric Distribution (independent integers versus
range of values) and their Geographic Distribution (abrupt boundaries versus gradient). A
Discrete map has values that simply represent categories (e.g., a covertype map) that form sharp,
abrupt boundaries, whereas a Continuous map has values that represent a spatial gradient (e.g., an
elevation surface).
(See MapCalc Applications, “Data Types” for more information)
Spatial Analysis investigates the “contextual” relationships in mapped data…
Reclassifying Maps– New map values are a function of the values on a single existing
map… no new spatial information is created,
Overlaying Maps– New map values are a function of the values on two or more existing
maps… new spatial information is created,
Measuring Distance– New map values are a function of the simple or weighted distance
or connectivity among map features, and
Summarizing Neighbors– New map values are a function of the values within the vicinity
of a location on an existing map.
There are three basic types of Spatial Models…
Statistical Models– based on numerical relationships (e.g., crop yield),
Process Models– based on physical relationships (e.g., erosion potential), and
Suitability Models– based on logically sequenced decision criteria similar to a recipe, for example:
• Binary Model– identifies areas that are acceptable based on combining binary maps (0 or 1 values),
• Ranking Model– develops a ranking of areas based on the number of criteria that are
acceptable (0 to 3), and
• Rating Model– develops a “goodness” scale (0 to 9 best) and calculates the average rating
for each grid cell.
(See Map Analysis book, “Topic 23” for more information)
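The three suitability approaches can be illustrated on small hypothetical criterion grids (all map values below are fabricated for the demo):

```python
import numpy as np

# Hypothetical 3x3 binary criterion maps (1 = acceptable, 0 = not)
gentle    = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 0]])   # gentle slopes
near_road = np.array([[1, 0, 0], [1, 1, 1], [1, 1, 1]])   # close to a road
dry_soil  = np.array([[1, 1, 1], [0, 1, 1], [0, 0, 1]])   # well-drained soil

binary  = gentle * near_road * dry_soil   # Binary: 1 only where all criteria pass
ranking = gentle + near_road + dry_soil   # Ranking: 0..3 criteria satisfied

# Rating: hypothetical 1-9 "goodness" maps averaged cell-by-cell
slope_rate = np.array([[9, 7, 2], [8, 6, 1], [3, 2, 1]])
road_rate  = np.array([[8, 4, 2], [9, 7, 5], [9, 8, 6]])
soil_rate  = np.array([[7, 8, 9], [2, 6, 8], [1, 3, 9]])
rating = (slope_rate + road_rate + soil_rate) / 3
```

The progression matters: the Binary model discards all gradation, the Ranking model keeps a count, and the Rating model preserves a full "goodness" scale for each grid cell.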
Measuring Distance– the concept of Distance as the “shortest straight line between two points” is
expanded to Proximity by relaxing the assumption of only “two points” then expanded to
Movement by relaxing the assumption of “straight-line” connectivity.
(See Map Analysis book, “Topic 13” and “Topic 14” for more information)
(See MapCalc Applications, “Determining Proximity” and “Creating an Up-Hill Road Buffer”)
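Relaxing the straight-line assumption amounts to accumulating least-cost movement over a grid of crossing costs. A minimal sketch using Dijkstra's algorithm with 4-neighbor moves (an illustration of the idea, not MapCalc's actual proximity implementation):

```python
import heapq

def effective_distance(cost, starts):
    """Accumulated least-cost distance from starter cells over a grid of
    per-cell crossing costs. Uniform costs reproduce simple proximity;
    varied costs model movement over friction surfaces."""
    rows, cols = len(cost), len(cost[0])
    INF = float("inf")
    dist = [[INF] * cols for _ in range(rows)]
    pq = []
    for r, c in starts:
        dist[r][c] = 0.0
        heapq.heappush(pq, (0.0, r, c))
    while pq:
        d, r, c = heapq.heappop(pq)
        if d > dist[r][c]:
            continue            # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # step cost: average of the two cells' crossing costs
                nd = d + (cost[r][c] + cost[nr][nc]) / 2
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(pq, (nd, nr, nc))
    return dist
```

Passing several starter cells at once (e.g., every cell on a road network) yields a proximity surface rather than point-to-point distance.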
Calculating Visual Exposure– a Viewshed identifies all locations that can be seen from a view
point(s) while Visual Exposure develops a relative scale indicating the number of times each
location is seen from a set of viewer points (e.g., a road network).
(See Map Analysis book, “Topic 15” for more information)
(See MapCalc Applications, “Determining Visual Exposure” and “Modeling Visual Exposure”)
Summarizing Neighbors– a Diversity Map indicates how many different types occur, a Roughness
Map identifies the variation in slope values, and a Density Map reports the total value within a
specified distance of each grid location.
(See MapCalc Applications, “Assessing Covertype Diversity”)
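All three neighborhood summaries follow the same roving-window pattern; a plain sketch with fabricated covertype data (edge cells simply use the truncated window):

```python
import numpy as np

def window_stats(grid, fn, radius=1):
    """Apply fn to the square roving window around each cell."""
    rows, cols = grid.shape
    out = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            win = grid[max(0, r - radius):r + radius + 1,
                       max(0, c - radius):c + radius + 1]
            out[r, c] = fn(win)
    return out

covertype = np.array([[1, 1, 2], [1, 3, 2], [3, 3, 2]])
diversity = window_stats(covertype, lambda w: len(np.unique(w)))  # # of types
density   = window_stats(covertype, np.sum)                       # total value
```

Swapping in a different summary function (e.g., the range of slope values) turns the same machinery into a roughness map.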
Surface Modeling maps the spatial distribution and pattern of point data…
Map Generalization– characterizes spatial trends (e.g., tilted plane) by considering all of the
samples at once as it fits a surface,
Spatial Interpolation– derives spatial distributions (e.g., IDW, Kriging) by considering small,
localized set of samples throughout the map area (roving window), and
Other– roving window/facets (e.g., density surface; tessellation)
(See Map Analysis book, “Topic 2” for more information)
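Inverse-distance weighting (IDW) can be sketched directly from its definition: each grid cell gets the distance-weighted average of the sample values. A minimal illustration, not a production interpolator (real packages add search radii and sample limits):

```python
import numpy as np

def idw(samples, grid_shape, power=2):
    """Build a surface from (row, col, value) point samples by
    inverse-distance-weighted averaging."""
    rows, cols = grid_shape
    surface = np.zeros((rows, cols))
    pts  = np.array([(r, c) for r, c, _ in samples], dtype=float)
    vals = np.array([v for _, _, v in samples], dtype=float)
    for r in range(rows):
        for c in range(cols):
            d = np.hypot(pts[:, 0] - r, pts[:, 1] - c)
            if d.min() == 0:                 # cell coincides with a sample
                surface[r, c] = vals[d.argmin()]
            else:
                w = 1.0 / d ** power         # nearer samples weigh more
                surface[r, c] = (w * vals).sum() / w.sum()
    return surface
```

Raising the power parameter makes the surface hug nearby samples more tightly; lowering it smooths toward the overall average.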
Data Mining investigates the “numerical” relationships in mapped data…
Descriptive– calculates aggregate statistics (e.g., average/stdev, similarity, clustering) that
summarize mapped data,
Predictive– develops relationships among maps (e.g., regression) that can be used to forecast
characteristics or conditions at other locations or times, and
Prescriptive– uses descriptive and predictive information to optimize appropriate actions.
(See Map Analysis book, “Topic 16” for more information)
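A minimal predictive example: fit a linear relationship between two registered maps and apply it to forecast a response surface elsewhere (all values fabricated for the demo):

```python
import numpy as np

# Hypothetical paired maps: a driver (e.g., a soil nutrient) and a
# response (e.g., crop yield) measured over the same grid.
driver   = np.array([[1.0, 2.0], [3.0, 4.0]])
response = 2.5 * driver + 1.0            # fabricated exact relationship

# Flatten the registered grids into paired observations and fit a line.
slope, intercept = np.polyfit(driver.ravel(), response.ravel(), 1)

# Apply the fitted relationship to a new driver map to forecast response.
new_driver = np.array([[5.0, 6.0], [7.0, 8.0]])
predicted  = slope * new_driver + intercept
```

The key move is treating each grid cell as an observation pairing the values of the co-registered maps, which is what lets ordinary regression operate on mapped data.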
Data Conversion investigates vector to/from raster data exchange…
V to R– burning the points, lines and areas into the grid (fat, thin and split),
R to V– connecting grid centroids, sides and edges (line smoothing), and
Pseudo Grid– each grid cell is stored as a polygon
(See Map Analysis book, “Topic 18” for more information)
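Burning points into a grid is the simplest V-to-R case; a sketch with hypothetical map units (lines and polygons additionally need the fat/thin/split rules noted above):

```python
def burn_points(points, rows, cols, cell_size, origin=(0.0, 0.0)):
    """Vector-to-raster conversion for points: each (x, y, value) is
    burned into the grid cell containing it."""
    grid = [[0] * cols for _ in range(rows)]
    ox, oy = origin
    for x, y, value in points:
        c = int((x - ox) // cell_size)   # column from x offset
        r = int((y - oy) // cell_size)   # row from y offset
        if 0 <= r < rows and 0 <= c < cols:
            grid[r][c] = value           # later points overwrite earlier ones
    return grid
```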
MapCalc Learner™ – Student Tutorial Version (CD) with
MapCalc and Surfer Tutorial systems, Exercises/databases, application demos and text; 100x100
configuration; single seat license for educational use only; US$21.95 plus shipping.
A BRIEF HISTORY AND PROBABLE FUTURE
OF GIS IN NATURAL RESOURCES
Joseph K. Berry
Berry and Associates // Spatial Information Systems, Inc.
2000 South College Avenue, Suite 300, Fort Collins, Colorado USA 80525
Natural resources management is inherently a spatial endeavor. Its data are particularly complex
as they require two descriptors, namely the precise location of what is being described, and a clear
description of its physical characteristics. For hundreds of years the link between the two
descriptors has been the traditional, manually drafted map involving pens, rub-on shading, rulers,
planimeters, dot grids, and acetate sheets.
More recently, analysis of mapped data has become an important part of resource planning. This
new perspective marks a turning point in the use of maps-- from one emphasizing physical
descriptions of geographic space, to one of interpreting mapped data, and finally, to spatially
communicating decision factors. This movement from "description to prescription" has set the
stage for entirely new concepts in natural resources planning and management.
Since the 1960's, the decision-making process has become increasingly quantitative, and
mathematical models have become commonplace. Prior to the computerized map, most spatial
analyses were severely limited by their manual processing procedures. The computer has
provided the means for both efficient handling of voluminous data and effective spatial analysis
capabilities. From this perspective, all geographic information systems are rooted in the digital
nature of the computerized map.
COMPUTER MAPPING (Automated Cartography)
The early 1970's saw computer mapping automate map drafting. The points, lines and areas
defining geographic features on a map are represented as an organized set of X, Y coordinates.
These data drive pen plotters that can rapidly redraw the connections in a variety of colors, scales,
and projections. The map image, itself, is the focus of this processing.
The pioneering work during this period established many of the underlying concepts and
procedures of modern GIS technology. An obvious advantage with computer mapping is the
ability to change a portion of a map and quickly redraft the entire area. Updates to resource maps
that once took weeks, such as mapping a forest fire burn, can be done in a few hours. The less obvious
advantage is the radical change in the format of mapped data— from analog inked lines on paper,
to digital values stored on disk.
SPATIAL DATA MANAGEMENT (Descriptive Analysis)
During the early 1980's, the change in data format and computer environment was exploited.
Spatial database management systems were developed that linked computer mapping capabilities
with traditional database management capabilities. In these systems, identification numbers are
assigned to each geographic feature, such as a timber harvest unit or wildlife management parcel.
For example, a user is able to point to any location on a map and instantly retrieve information
about that location. Alternatively, a user can specify a set of conditions, such as a specific forest
and soil combination, and direct the result of the geographic search to be displayed as a map.
During the early development of GIS, two alternative data structures for encoding maps were
debated. The vector data model closely mimics the manual drafting process by representing map
features as a set of lines which, in turn, are stored as a series of X,Y coordinates. An alternative
structure, termed the raster data model, establishes an imaginary grid over a project area, and then
stores resource information for each cell in the grid. The early debate attempted to determine the
universally best structure. The relative advantages and disadvantages of both were viewed in a
competitive manner that failed to recognize the overall strengths of a GIS approach encompassing both.
By the mid-1980's, the general consensus within the GIS community was that the nature of the
data and the processing desired determines the appropriate data structure. This realization of the
duality of mapped data structure had significant impact on geographic information systems. From
one perspective, resource maps form sharp boundaries that are best represented as lines. Property
ownership, timber sale boundaries, and haul road networks are examples where lines are real and
the data are certain. Other maps, such as soils, site index, and slope are interpretations of terrain
conditions. The placement of lines identifying these conditions is subject to judgment, statistical
analysis of field data, and broad classification of continuous spatial distributions. From this
perspective, the sharp boundary implied by a line is artificial and the data itself is based on interpretation.
Increasing demands for mapped data focused attention on data availability, accuracy and
standards, as well as data structure issues. Hardware vendors continued to improve digitizing
equipment, with manual digitizing tablets giving way to automated scanners at many GIS
facilities. A new industry for map encoding and database design emerged, as well as a
marketplace for the sales of digital map products. Regional, national and international
organizations began addressing the necessary standards for digital maps to ensure compatibility
among systems. This era saw GIS database development move from project costing to equity
investment justification in the development of corporate databases.
SPATIAL ANALYSIS AND MODELING (Prescriptive Analysis)
As GIS continued its evolution, the emphasis turned from descriptive query to prescriptive
analysis of maps. For the greater part, early GIS systems concentrated on automating traditional
mapping practices. If a resource manager had to repeatedly overlay several maps on a light-table,
an analogous procedure was developed within the GIS. Similarly, if repeated distance and bearing
calculations were needed, the GIS system was programmed with a mathematical solution. The
result of this effort was GIS functionality that mimicked the manual procedures in a resource
manager's daily activities. The value of these systems was the savings gained by automating
tedious and repetitive operations.
By the mid-1980's, the bulk of descriptive query operations were available in most GIS systems.
A comprehensive theory of spatial analysis began to emerge. The dominant feature of this theory
is that spatial information is represented numerically, rather than in analog fashion as inked lines
on a map. These digital maps are frequently conceptualized as a set of "floating maps" with a
common registration, allowing the computer to "look" down and across the stack of digital maps.
The spatial relationships of the data can be summarized (database queries) or mathematically
manipulated (analytic processing). Because of the analog nature of traditional map sheets, manual
analytic techniques are limited in their quantitative processing. Digital representation, on the other
hand, makes a wealth of quantitative (as well as qualitative) processing possible. The application
of this new theory to natural resources is revolutionary. Its application takes two forms-- spatial
statistics and spatial modeling.
Geophysicists have used spatial statistics for many years to characterize the geographic
distribution, or pattern, of mapped data. The statistics describe the spatial variation in the data,
rather than assuming a typical response is everywhere. For example, field measurements of snow
depth can be made at several plots within a watershed. Traditionally, these data are analyzed for a
single value (the average depth) to characterize the watershed. Spatial statistics, on the other hand,
uses both the location and the measurements at the plots to generate a map of relative snow depth
throughout the entire watershed.
The full impact of this map-ematical treatment of maps is yet to be determined. The application of
such concepts as spatial correlation, statistical filters, map uncertainty and error propagation await
their translation from other fields.
Spatial analysis, on the other hand, has a rapidly growing number of current resource applications.
For example, natural resources managers can characterize timber supply by considering the
relative skidding and log-hauling accessibility of harvesting parcels. Wildlife managers can
consider such factors as proximity to roads and relative housing density to map human activity and
incorporate this information into habitat delineation. Natural resources planners can assess the
visual exposure of alternative sites for a facility to sensitive viewing locations, such as roads and residential areas.
Spatial mathematics has evolved similarly to spatial statistics by extending conventional concepts.
This "map algebra" uses sequential processing of spatial operators to perform complex map
analyses. It is similar to traditional algebra in which primitive operations (e.g., add, subtract,
exponentiate) are logically sequenced on variables to form equations. However, in map algebra,
entire maps composed of thousands or millions of numbers represent the variables of the spatial equation.
Most of the traditional mathematical capabilities, plus an extensive set of advanced map
processing operations, are available in modern GIS packages. You can add, subtract, multiply,
divide, exponentiate, root, log, cosine, differentiate and even integrate maps. After all, maps in a
GIS are just an organized set of numbers. However, with map-ematics, the spatial coincidence
and juxta-positioning of values among and within maps create new operations, such as effective
distance, optimal path routing, visual exposure density and landscape diversity, shape and pattern.
These new tools and modeling approaches to natural resource information combine to extend record-
keeping systems and decision-making models into effective decision support systems.
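The cell-by-cell character of map algebra is easy to see with registered arrays (values fabricated for the demo):

```python
import numpy as np

# Two registered "floating maps": every cell location lines up across the stack.
elevation = np.array([[100.0, 120.0], [140.0, 160.0]])
rainfall  = np.array([[2.0, 2.5], [3.0, 3.5]])

runoff_index = np.log(elevation) * rainfall    # a cell-by-cell map equation
steep_mask   = (elevation > 130).astype(int)   # reclassify by threshold
combined     = runoff_index * steep_mask       # overlay: zero out gentle cells
```

Each line is an entire-map operation: the arithmetic applies at every cell simultaneously, which is what distinguishes map algebra from scalar algebra.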
SPATIAL REASONING (Communicating Perceptions)
The turn of the century will build on the cognitive basis, as well as the databases, of current
geographic information systems. GIS is at a threshold that is pushing beyond mapping,
management, and modeling, to spatial reasoning and dialogue. In the past, analysis models have
focused on management options that are technically optimal-- the scientific solution. Yet in
reality, there is another set of perspectives that must be considered-- the social solution. It is this
final sieve of management alternatives that most often confounds resource decision-making. It
uses elusive measures, such as human values, attitudes, beliefs, judgment, trust and understanding.
These are not the usual quantitative measures amenable to computer algorithms and traditional decision-making models.
The step from technically feasible to socially acceptable options is not so much a matter of increased
scientific and econometric modeling as one of communication. Basic to effective communication is
involvement of interested parties throughout the decision-making process. This new participatory
environment has two main elements-- consensus building and conflict resolution. Consensus
building involves technically driven communication and occurs during the alternative formulation
phase. It involves the resource specialist's translation of the various considerations raised by a
decision team into a spatial model. Once completed, the model is executed under a wide variety
of conditions and the differences in outcome are noted.
From this perspective, a single map of a forest plan is not the objective. It is how maps change as
the different scenarios are tried that becomes information. "What if avoidance of visual exposure
is more important than avoidance of steep slopes in siting a new haul road? Where does the
proposed route change, if at all?" Answers to these analytic queries focus attention on the effects
of differing perspectives. Often, seemingly divergent philosophical views result in only slightly
different map views. This realization, coupled with active involvement in the decision-making
process, often leads to group consensus.
If consensus is not obtained, conflict resolution is necessary. This socially-driven communication
occurs during the decision formulation phase. It involves the creation of a "conflicts map" which
compares the outcomes from two or more competing uses. Each management parcel is assigned a
numeric code describing the actual conflict over the location. A parcel might be identified as ideal
for a wildlife preserve, a campground and a timber harvest. As these alternatives are mutually
exclusive, a single use must be assigned. The assignment, however, involves a holistic perspective
that simultaneously considers the assignments of all other locations in a project area.
Traditional scientific approaches are rarely effective in addressing the holistic problem of conflict
resolution. Most are linear models, requiring a succession, or cascade, of individual parcel
assignments. The final result is strongly biased by the ordering of parcel consideration. Even if a
scientific solution is reached, it is viewed with suspicion by the layperson. Modern resource
information systems provide an alternative approach involving human rationalization and
tradeoffs. This process involves statements like, "If you let me harvest this parcel, I will let you
set aside that one as a wildlife preserve." The statement is followed by a persuasive argument
and group discussion. The dialogue is far from a mathematical optimization, but often closer to an
effective decision. It uses the information system to focus discussion away from broad
philosophical positions, to a specific project area and its unique distribution of conditions and potential uses.
Natural resources planning and management have always required information as their
cornerstone. Early resource information systems relied on physical storage of data and manual
processing. With the advent of the computer, most of these data and procedures have been
automated during the past two decades. As a result, resource information processing has
become increasingly quantitative. Systems analysis techniques developed links between
descriptive data of the forest landscape and the mix of management actions that maximizes a set of
objectives. This mathematical approach to natural resources management has been both
stimulated and facilitated by modern information systems technology. The digital nature of
mapped data in these systems provides a wealth of new analysis operations and a comprehensive
ability to spatially model complex resource issues. The full impact of the new data form and
analytical capabilities is yet to be determined. Resource professionals are challenged to
understand this new environment and formulate new applications. It is clear that geographic
information systems have revolutionized, and will continue to revolutionize, the resource decision-making
process.
Author's Notes: This paper is based on topics discussed in Beyond Mapping: Concepts,
Algorithms, and Issues in GIS (J.K. Berry, GIS World Books, 1993) and Spatial Reasoning for
Effective GIS (J.K. Berry, GIS World Books, 1995) available through the Internet at
www.amazon.com. Additional papers, presentations and other materials on GIS applications in
natural resources are available online at www.innovativegis.com/basis.