This document describes dasymetric mapping, a technique for distributing population data more accurately in space. It works through an example of mapping Wisconsin's population from census tracts, which often mask variations in population density within each tract. Land cover data serve as the controlling variable: tract populations are reapportioned, via regression analysis, to zones whose estimated population densities depend on land cover class. While an improvement over simple census-tract mapping, the method has limitations, chiefly that land cover data are not a perfect proxy for population distribution.
GEOGRAPHIC SKILLS: CHOROPLETH MAPS.
A choropleth map is a thematic map in which areas are shaded or patterned in proportion to the measurement of the statistical variable being displayed on the map, such as population density or per-capita income.
The objective of image classification is to classify each pixel into only one class (crisp or hard classification) or to associate the pixel with many classes (fuzzy or soft classification). The classification techniques may be categorized either on the basis of training process (supervised and unsupervised) or on the basis of theoretical model (parametric and non-parametric).
Unsupervised classification groups pixels with common characteristics based on software analysis of an image, without the user providing sample classes. The computer uses techniques to determine which pixels are related and groups them into classes. The user can specify which algorithm the software will use and the desired number of output classes, but otherwise does not aid in the classification process. However, the user must have knowledge of the area being classified in order to relate the computer-generated groupings of pixels to actual features on the ground (such as waterbodies, developed areas, forests, etc.).
Supervised classification is based on the idea that a user can select sample pixels in an image that are representative of specific classes and then direct the image processing software to use these training sites as references for the classification of all other pixels in the image. Input classes are selected based on the knowledge of the user. The user also sets the bounds for how similar other pixels must be to be grouped together. These bounds are often set based on the spectral characteristics of the training areas (areas of interest, or AOIs), plus or minus a certain increment (often based on "brightness," or strength of reflection, in specific spectral bands). The user also designates the number of classes the image is classified into.
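The unsupervised workflow described above can be sketched with a bare-bones k-means clustering of pixel spectra. This is a minimal plain-Python sketch with a made-up two-band "image"; production remote-sensing software uses more elaborate algorithms (such as ISODATA), and the pixel values and interpretations here are purely hypothetical:

```python
import random

def kmeans(pixels, k, iterations=20, seed=0):
    """Cluster pixel spectra (tuples of band values) into k classes.

    A bare-bones k-means: the user picks k (the number of output
    classes) but never labels any training pixels -- the grouping is
    driven entirely by the data, as in unsupervised classification.
    """
    rng = random.Random(seed)
    centers = rng.sample(pixels, k)
    labels = [0] * len(pixels)
    for _ in range(iterations):
        # Assign each pixel to the nearest cluster center.
        for i, p in enumerate(pixels):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
            )
        # Recompute each center as the mean of its member pixels.
        for c in range(k):
            members = [p for p, l in zip(pixels, labels) if l == c]
            if members:
                centers[c] = tuple(
                    sum(band) / len(members) for band in zip(*members)
                )
    return labels, centers

# Two-band "image": a dark cluster (perhaps water) and a bright one
# (perhaps developed land) -- invented values for illustration only.
pixels = [(10, 12), (11, 13), (9, 11), (200, 190), (205, 195), (198, 188)]
labels, centers = kmeans(pixels, k=2)
```

As noted above, the analyst's remaining job is to relate the resulting numbered classes to actual ground features.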
THIS PRESENTATION IS TO HELP YOU PERFORM THE TASK STEP BY STEP.
A map is a drawn or printed representation of the physical features of the Earth.
It is the best tool to show, understand, and analyse the features of an area. Cartography is the art and science of making maps. This module presents information on maps, their types, and their uses.
Geography is a spatial science, and a 'space' has multiple dimensions that describe its characteristics in terms of the habitat, economy, and society of man. Therefore, for practical purposes of spatial data analysis, we need to apply sampling techniques to identify units of survey at a given level of statistical significance.
Thematic Cartography
Concept of Thematic Cartography: Importance of Thematic Maps.
Diagrammatic Data Presentation - Line Graph: Simple, Polygraph, Combined Line Graph; Band Graph; Climograph; Hythergraph; Ergograph.
Representation of Data: Choropleth, Isopleths, Dot & Point Data
Cartography is the art, science and technology of map making.
Maps are used as research tools and as sources of information.
Maps have existed since the time of the Egyptian, Mesopotamian, and Chinese civilizations, with the oldest dating back some 6,000 years.
Presentation on transformation and/or integration for the advisory committee of the research project DieGem (www.solidariteitindiversiteit.be), 1 April 2015.
Tower Automotive Gent, an assembly plant in the port of Ghent, supplies parts "just in time" to Volvo Cars. The cultural diversity among its employees poses a real challenge. Union representatives and HRM staff take on that challenge and forge bonds of trust with and among employees.
The wildly creative and inspirational dance company Wonderbound presents its Lone Tree Arts Center debut, Boomtown, a Colorado-themed performance featuring Chimney Choir. The show takes place at the Lone Tree Arts Center on Saturday, April 25, 2015 at 8:00 PM. For more info visit http://lonetreeartscenter.org/showinfo.php?id=195
Startup life cycle from acceleration stage to launching and growth. A collection of stakeholders and market players affecting the investment, growth, user base, and acquisition.
TYBSC IT PGIS Unit I Chapter II: Geographic Information and Spatial Database (Arti Parab Academics)
Geographic Information and Spatial Database: models and representations of the real world. Geographic phenomena: defining geographic phenomena, types of geographic phenomena, geographic fields, geographic objects, boundaries. Computer representations of geographic information: regular tessellations, irregular tessellations, vector representations, topology and spatial relationships, scale and resolution, representation of geographic fields, representation of geographic objects. Organizing and managing spatial data. The temporal dimension.
This is the PowerPoint prepared by Dawn Davies (Hill Country Alliance) for Texas Public Radio's Think Earth event held on October 7, 2022. The slide presentation focuses on light pollution, and matches with the audio on this page: https://www.tpr.org/tpr-events-initiatives/2022-09-28/think-earth-pollution
We propose a game where two players take turns assigning precincts to districts. In a simplified setting where districts have no geographic constraints, both players have a strategy that allows them to win a number of districts proportional to their number of voters. For the game on real maps (with geographic constraints) we are developing a player based on neural networks and reinforcement learning that aims to learn how to play this game optimally through self-play (inspired by AlphaZero). As in other simulation-based gerrymandering research, the difficulty in this approach is the size of the problem. In fact, we show that the problem of deciding whether there exists a 'fair map' in the set of 'legal maps' (for appropriately simple definitions of 'legal' and 'fair') is NP-complete.
Data Visualization: GIS and maps; the visualization process. Visualization strategies: present or explore? The cartographic toolbox: what kind of data do I have? How can I map my data? How to map: qualitative data, quantitative data, terrain elevation, time series. Map cosmetics. Map dissemination.
Local figure–ground cues are valid for natural images
Charless C. Fowlkes, Department of Electrical Engineering and Computer Science, University of California at Berkeley, Berkeley, CA, USA
David R. Martin, Computer Science Department, Boston College, Boston, MA, USA
Jitendra Malik, Department of Electrical Engineering and Computer Science, University of California at Berkeley, Berkeley, CA, USA
Figure–ground organization refers to the visual perception that a contour separating two regions belongs to one of the regions. Recent studies have found neural correlates of figure–ground assignment in V2 as early as 10–25 ms after response onset, providing strong support for the role of local bottom–up processing. How much information about figure–ground assignment is available from locally computed cues? Using a large collection of natural images, in which neighboring regions were assigned a figure–ground relation by human observers, we quantified the extent to which figural regions locally tend to be smaller, more convex, and to lie below ground regions. Our results suggest that these Gestalt cues are ecologically valid, and we quantify their relative power. We have also developed a simple bottom–up computational model of figure–ground assignment that takes image contours as input. Using parameters fit to natural image statistics, the model is capable of matching human-level performance when scene context is limited.
Keywords: figure–ground, ecological statistics, natural scenes, perceptual organization
Citation: Fowlkes, C. C., Martin, D. R., & Malik, J. (2007). Local figure–ground cues are valid for natural images. Journal of Vision, 7(8):2, 1–9, http://journalofvision.org/7/8/2/, doi:10.1167/7.8.2.
Introduction
In the 1920s, the Gestalt psychologists identified grouping and figure–ground as two major principles underlying the process of perceptual organization. Grouping describes the way that individual elements of a stimulus come together to form a perceptual whole. Figure–ground refers to the perception that a contour separating two regions "belongs" to one of the two regions. The figural region takes on shape imparted by the separating contour and appears closer to the viewer, whereas the ground region is seen as extending behind the figure. Both grouping and figure–ground are thought to be important in reducing the visual complexity of a scene to that of processing a small number of cohesive, nonaccidental units.

Starting with Rubin (1921), who first pointed out the significance of figure–ground organization, a long list of factors that affect figure–ground assignment has been identified. These include size, surroundedness, orientation, and contrast (Rubin, 1921), as well as symmetry (Bahnsen, 1928), parallelism (Metzger, 1953), convexity (Kanizsa & Gerbino, 1976; Metzger, 1953), meaningfulness (Peterson, 1994), and lower region (Vecera, Vogel, & Woodman, 2002).

How might these cues be computed in the brain? It is conceivable that cues su ...
Individual movements and geographical data mining. Clustering algorithms for highlighting hotspots in personal navigation routes.
Giuseppe Borruso, Gabriella Schoier - University of Trieste
"Discovering Historic Wisconsin Geospatial Data." May 2023 presentation for Wisconsin Wetland Association "Wetland Coffee Break" by Howard Veregin, Wisconsin State Cartographer.
Wisconsin State Cartographer's Office (SCO) presentation to Wisconsin Society of Land Surveyors (WSLS) August 2019 describing SCO online web mapping apps of use to surveyors.
How Wisconsin's GIS, geospatial and surveying communities are preparing for the new National Geodetic Survey (NGS) datum, NATRF2022 (North American Terrestrial Reference Frame of 2022).
Talk given at the 2015 Fall Regional in Oshkosh WI.
"An Approach to Address Parsing and Data Standardization"
Abstract:
Maintaining fully parsed address elements in your database can be one of the most beneficial steps toward achieving quality and consistency in addressing. Parsed address elements also serve as a preparatory step in modeling an address toward NG9-1-1 supporting formats such as the FGDC address standard. In this talk, we'll take a look at the approach we've used for parsing site addresses for the V1 Statewide Parcel Map and the role regular expressions played in this approach, and we will unveil a suite of (free) ArcPy tools that can help you parse addresses, standardize field values, and achieve other tasks.
Presenters:
Codie See
David Vogel
What's the status of the NSDI?
Cowen's address will provide his perspective on the current status of the National Spatial Data Infrastructure (NSDI). He will draw from his extensive experience with the National Research Council's Mapping Science Committee, chairing the NRC study National Land Parcel Data: A Vision for the Future, a recent term as chair of the National Geospatial Advisory Committee, and his service as vice chairman of the Coalition of Geospatial Organizations (COGO) Report Card Committee on the NSDI. Through these activities he has observed and analyzed the Federal geospatial landscape for the thirty years since President Clinton issued Executive Order 12906, Coordinating Geographic Data Acquisition and Access: The National Spatial Data Infrastructure, in 1994. He will comment on the changing roles of various stakeholders in the collection, maintenance, and sharing of geospatial data.
Carrying through on the promise: Lessons from prototype efforts in Wisconsin thirty years ago
In the 1980s, Wisconsin served as a testbed for certain concepts in the modernization of land records. One key project was the Dane County Land Records Project, which assembled partners from county, state, and federal government to collaborate with university and industry in trying out a number of untested technologies. This project established some alternative pathways to the multipurpose cadastre, contributed concepts to the rethinking of spatial data infrastructures, and of course delivered a working solution for a statewide soil erosion planning problem. This presentation will return to some of the key issues of thirty years ago that remain crucial in the current era. Incrementalism: there are current sources in use that need to be upgraded to remain useful. Multilateral collaboration: any solution must mobilize many actors and provide appropriate roles for each. Technological trials: without full-scale realistic applications, the promise of technology remains just a promise. The presentation will link the issues of 1985 (the moment at which the Wisconsin Land Information Program was launched) to those of 2015. Important elements remain central.
Opendatabay - Open Data Marketplace (Opendatabay)
Opendatabay.com unlocks the power of data for everyone. Its Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
It is the first open hub for data enthusiasts to collaborate and innovate: a platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, Opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. It leverages cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex. Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay also breaks new ground with dedicated, AI-generated synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits, Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay: the marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23… (John Andrews)
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2… (pchutichetpong)
M Capital Group ("MCG") expects demand to grow as the supply side continues to evolve, facilitated by institutional investment rotating out of offices and into work-from-home ("WFH") arrangements, while the need for data storage keeps expanding with global internet usage; experts predict 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as advancing cloud services and edge sites, allowing the industry to expect strong annual growth of 13% over the next 4 years.
Whilst competitive headwinds remain, exemplified by the recent second bankruptcy filing of Sungard, which blames "COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services", the industry has seen key adjustments, and MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that the more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment will be driving market momentum forward. The continuous injection of capital by alternative investment firms, as well as the growing infrastructural investment from cloud service providers and social media companies, whose revenues are expected to grow over 3.6x larger by value in 2026, will likely help propel center provision and innovation. These factors paint a promising picture for the industry players that offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clear ownership and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
Population Density Mapping using the Dasymetric Method
2. Problem: Mapping units may be a poor match to the spatial distribution of the phenomenon being mapped. Example: distribution of a cropland variable within a county.
Solution: Reapportion the variable spatially based on knowledge (or assumptions) about its spatial distribution. Example: reapportion total county cropland to zones based on land use.
Limiting or related variable: a variable that controls, that we think has an effect on, or that is statistically related to, the phenomenon being mapped. Example: land use (for cropland).
Why do this? It should produce a better map, with a more accurate spatial distribution of the phenomenon.
What is Dasymetric Mapping?
4. Early Example
Source: John K. Wright, "A Method of Mapping Densities of Population: With Cape Cod as an Example," Geographical Review, Vol. 26, No. 1 (Jan., 1936), pp. 103-110.
6. Goal: Create a dasymetric map of Wisconsin population based on census tract populations.
~1400 census tracts in Wisconsin
Tracts can be quite large
Population density not uniform within tracts
We’d like a better map
Wisconsin Case Study
8. Why not use census blocks?
Blocks are much smaller than tracts, on average.
But in rural areas, blocks can still be very large.
This is a problem when there are population concentrations in rural areas, such as unincorporated communities.
Wisconsin Case Study
11. Controlling variable: Landcover
NLCD 2006…the most current at the time
Resolution of 30 meters
Maybe not the best choice…more later
The starting point is to intersect the tract layer and the land cover layer
Wisconsin Case Study
14. Let Pi,j be the population of polygon "i,j" (formed by the intersection of tract i and land cover polygon j)
Dasymetric Equation
15. Then
Pi,j = dj x ai,j
where
dj = population density of land cover polygon j
ai,j = area of polygon “i,j”
Dasymetric Equation
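The equation above can be applied per intersection polygon. A minimal sketch in plain Python, with made-up density and area values (in the actual workflow the densities come from the regression described on later slides, not from assumed values like these):

```python
# Hypothetical densities (people per sq. km) by land cover class -- invented
# for illustration; the deck estimates these from the data.
density = {"developed_high": 2500.0, "developed_low": 600.0, "forest": 5.0}

# Intersection polygons "i,j": (tract i, land cover class of polygon j, area a_ij).
polygons = [
    (1, "developed_high", 0.8),
    (1, "developed_low", 2.0),
    (1, "forest", 40.0),
]

# P_ij = d_j * a_ij for each intersection polygon.
pop = [(tract, cover, density[cover] * area) for tract, cover, area in polygons]

# The tract's dasymetric total is the sum over its pieces.
tract_total = sum(p for _, _, p in pop)
```

Note how the population lands almost entirely in the small developed polygons rather than being spread uniformly over the tract, which is the point of the method.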
16. Note that dj (the population density of land cover polygon j) depends on the land cover class of polygon j.
The most complex part of dasymetric mapping is estimating population densities for each land cover class.
To generate estimates, use the polygons created by intersecting the tract and land cover layers to get the ai,j values. Sum all areas within each tract that belong to the same land cover class.
Density Estimation
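Summing areas by tract and class is a simple aggregation. A sketch in plain Python, with hypothetical polygon records standing in for the actual tract/NLCD intersection output:

```python
from collections import defaultdict

# Intersection polygons: (tract id, land cover class, area in sq. km).
polygons = [
    (1, "forest", 12.0),
    (1, "forest", 3.0),        # same tract & class: areas get summed
    (1, "developed_low", 1.5),
    (2, "forest", 7.0),
]

# a[i][k] = total area of land cover class k within tract i.
a = defaultdict(lambda: defaultdict(float))
for tract, cover, area in polygons:
    a[tract][cover] += area
```

These per-tract, per-class totals become the ai,k values in the regression model on the next slide.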
17. Set up the following regression model:
Pi = d1 x ai,1 + d2 x ai,2 + … + dK x ai,K
where:
Pi = observed population of tract i
ai,k = observed area of all land cover polys within tract i for which the land cover class = k
dk = population density for cover class k (unknown coefficients)
K = number of unique land cover classes
Density Estimation
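The deck fits these coefficients with GRG optimization. As a rough stand-in (plain Python, synthetic data, and projected-gradient least squares rather than GRG), a fit that enforces the same nonnegativity constraint on the dk can be sketched as:

```python
def fit_densities(areas, pops, steps=20000, lr=None):
    """Least-squares fit of per-class densities d_k, constrained nonnegative.

    areas: one row per tract; each row lists the total area of each
           land cover class within that tract (the a_ik values).
    pops:  observed tract populations P_i.
    Projected gradient descent on sum_i (P_i - sum_k d_k a_ik)^2,
    clipping each d_k at zero after every step -- a crude substitute
    for the GRG optimization used in the deck.
    """
    n, k = len(areas), len(areas[0])
    if lr is None:
        # Conservative step size based on the largest squared row norm.
        lr = 0.5 / max(sum(a * a for a in row) for row in areas) / n
    d = [0.0] * k
    for _ in range(steps):
        # Residuals r_i = predicted - observed.
        r = [sum(dk * a for dk, a in zip(d, row)) - p
             for row, p in zip(areas, pops)]
        for j in range(k):
            grad = 2.0 * sum(ri * row[j] for ri, row in zip(r, areas))
            d[j] = max(0.0, d[j] - lr * grad)  # project onto d_j >= 0
    return d

# Synthetic example: two classes with true densities 100 and 2 per sq. km.
areas = [[1.0, 10.0], [2.0, 5.0], [0.5, 20.0]]
pops = [100.0 * a1 + 2.0 * a2 for a1, a2 in areas]
densities = fit_densities(areas, pops)
```

With clean synthetic data the fit recovers the true densities; with real tract data the coefficients are only statistical estimates, which is why the deck calls this the most complex part of the method.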
18. This is analogous to "hedonic" regression, the classic example of which is estimating the increase in the market value (in $) of a home contributed by specific characteristics (bathroom, deck, garage…).
In our case, we estimate the increase in the population of a tract associated with a unit-area increase in each land cover class.
There are many ways to implement this. We use Generalized Reduced Gradient (GRG) optimization to constrain dk > 0.
Regression Analysis
24. Land cover data not a good proxy for population density.
NLCD includes transportation in "low intensity developed," causing population to be reapportioned from census tracts to the transportation network.
NLCD mixes residential and non-residential in "high intensity developed," causing high-population areas to be mixed with low-population areas such as malls and parking lots.
Should try to combine land cover with land use, zoning, parcels…
Data-intensive: even after simplification, the intersection of tracts and land cover generates about 3 million polygons.
Small-scale depiction of the population distribution over the state; not accurate for large-scale mapping.
Issues
25. CREDITS
RESEARCH TEAM
Blaine Hackett, Co-Founder and President, Flat Rock Geographics
Tom Cox, Minnesota Power; formerly a UW-Madison student
Howard Veregin, Wisconsin State Cartographer
PHOTO CREDITS
Making a map for the blind. Stefan Kühn (http://hdl.loc.gov/loc.pnp/ggbain.19023) [Public domain], via Wikimedia Commons.
Image of an officer and soldier making maps in France, 1917-1918. US Army Signal Corps (US Army Center of Military History, Carlisle, PA) [Public domain], via Wikimedia Commons.
Staff Sergeant Blake Ellis, Sheel Creek, Tennessee, inking in the pencil tracings. Culture, hydrography, and contours are shown. England, 01/11/1943. Department of Defense, Department of the Army, Office of the Chief Signal Officer (09/18/1947 - 02/28/1964) [Public domain], via Wikimedia Commons.