Thank you! Presenter: Andrew Harrison, GISP, Business Development Manager, The Schneider Corporation, (317) 826-7393, [email_address]
Editor's Notes
Fancy software does not equal accurate data. That's the main point: your software doesn't dictate your accuracy. Additional data layers will only be as accurate as the basemap; layers built on top of orthos, PLSS (in this part of the country), and parcels will only be as good as the basemap.
This county held an attribute in their parcel database that showed which watershed each parcel was (mostly) in. In reality, the watersheds probably looked something like this.
What is needed? We utilized ArcGIS, Spatial Analyst, and a hydrology modeling extension downloaded from the ESRI scripts website.
Two-foot contours were used as the base data for elevation.
Linear stream features were compiled from local, regional, and state sources.
A raster was created from the two-foot contours; lighter areas are higher elevation. QC process: local knowledge (water in this area flows to the NW). There's a break in the SE.
Flow direction is created from the elevation raster; it tells the direction that water will flow. It uses 20-foot cells: the GIS calculates which direction the majority of water will flow from each cell. Water accumulates at the center, where it all flows NW (noted by lines of pixels side by side).
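The flow-direction idea can be sketched in plain Python. This is a minimal illustration of the D8 approach used by grid hydrology tools (each cell points at its steepest downhill neighbor, encoded with ESRI-style power-of-two codes); the function name, test surface, and elevations are assumptions for illustration, not the extension's actual API or the county's data.

```python
# The eight neighbor offsets, with ESRI-style D8 power-of-two codes:
# 1=E, 2=SE, 4=S, 8=SW, 16=W, 32=NW, 64=N, 128=NE.
D8 = [
    (0, 1, 1), (1, 1, 2), (1, 0, 4), (1, -1, 8),
    (0, -1, 16), (-1, -1, 32), (-1, 0, 64), (-1, 1, 128),
]

def flow_direction(elev):
    """For each cell, return the D8 code of the steepest downhill
    neighbor, or 0 if the cell is a sink (no lower neighbor)."""
    rows, cols = len(elev), len(elev[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            best_drop, best_code = 0.0, 0
            for dr, dc, code in D8:
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # Diagonal neighbors are farther, so divide by sqrt(2).
                    dist = 2 ** 0.5 if dr and dc else 1.0
                    drop = (elev[r][c] - elev[nr][nc]) / dist
                    if drop > best_drop:
                        best_drop, best_code = drop, code
            out[r][c] = best_code
    return out

# Tiny 3x3 surface sloping down toward the northwest corner,
# like the trend described in the county data:
elev = [
    [890, 892, 894],
    [892, 894, 896],
    [894, 896, 898],
]
fd = flow_direction(elev)  # fd[1][1] == 32 (NW); fd[0][0] == 0 (drains off)
```

In a production run the cell size (20 feet here) matters only for the diagonal-distance weighting; the direction codes themselves are what downstream steps like flow accumulation consume.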
This raster shows where water will accumulate as it flows; lighter areas are "deeper."
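Flow accumulation can be sketched the same way: given a D8 direction grid (ESRI-style codes, 1=E through 128=NE), each cell's value is the number of upstream cells whose water passes through it. The code below walks downstream from every cell; it is an illustrative sketch, not the Spatial Analyst implementation.

```python
# Map each D8 code to its (row, col) offset.
CODE_TO_OFFSET = {
    1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
    16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1),
}

def flow_accumulation(fdir):
    """Count, for each cell, how many upstream cells drain through it."""
    rows, cols = len(fdir), len(fdir[0])
    acc = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Walk downstream from this cell, crediting each cell passed.
            cr, cc = r, c
            seen = set()
            while fdir[cr][cc] in CODE_TO_OFFSET and (cr, cc) not in seen:
                seen.add((cr, cc))
                dr, dc = CODE_TO_OFFSET[fdir[cr][cc]]
                cr, cc = cr + dr, cc + dc
                if not (0 <= cr < rows and 0 <= cc < cols):
                    break  # water left the map
                acc[cr][cc] += 1
    return acc

# Three cells in a row, all flowing west (code 16): water piles up
# toward the left edge, which is why accumulated cells look "deeper."
fdir = [[16, 16, 16]]
acc = flow_accumulation(fdir)  # acc == [[2, 1, 0]]
```

High-accumulation cells are exactly the side-by-side pixel lines noted on the previous slide: they trace where the model thinks the streams are.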
A raster stream layer is made from the vector streams layer. This is done so that the rasters can be calculated against each other. Lighter is higher elevation.
The rasters match up with each other; the trends (NW flow and SE break) are evident in each.
A joke from an Indiana surveyor: what he heard on his first day on the job.
Image from the ArcGIS help (it's good for something!). GIS is only as intelligent as you make it. In calculating the watersheds, little inconsistencies in the elevation data (peaks and sinks) would create obstacles, and we use this process to "fill" them. The process forces water to flow "somewhere" off the map (an assumption).
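The fill step can be sketched as follows. This is a deliberate simplification of the real Fill tool (which handles multi-cell depressions with spill-point logic): here only single-cell sinks, i.e. interior cells lower than every neighbor, are raised to their lowest neighbor so water can keep moving. The function name and sample grid are illustrative assumptions.

```python
def fill_single_cell_sinks(elev):
    """Raise any interior pit to the level of its lowest neighbor,
    repeating until the surface is stable (edge cells are left alone
    because they can already drain off the map)."""
    rows, cols = len(elev), len(elev[0])
    out = [row[:] for row in elev]
    changed = True
    while changed:
        changed = False
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                neighbors = [
                    out[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)
                ]
                lowest = min(neighbors)
                if out[r][c] < lowest:  # a pit: raise it to spill level
                    out[r][c] = lowest
                    changed = True
    return out

# A one-cell pit at 880 ft inside a surface that otherwise drains NW:
elev = [
    [890, 892, 894],
    [892, 880, 896],
    [894, 896, 898],
]
filled = fill_single_cell_sinks(elev)  # pit raised from 880 to 890
```

Without this step, the flow-direction model would terminate at the pit instead of routing that water off the map.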
Another obstacle: culverts. The source contours show this spot at an elevation consistent with the road, even though a culvert was put in place to allow for water flow. We must account for this culvert.
How do we take care of that? Look at the raster value for this elevation (892 feet).
We can "burn in the stream," or force water to flow to the stream, by calculating an exaggerated amount (negative 1,000 feet). It's not "real," but it gets the job done.
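Stream burning itself is a simple raster calculation: wherever the rasterized stream layer flags a cell, drop the elevation by the exaggerated 1,000 feet so the flow model is forced into (and along) the channel, e.g. under the 892-foot road at the culvert. A minimal sketch, with made-up values:

```python
def burn_streams(elev, streams, drop=1000):
    """streams is a same-shaped grid of 0/1 flags rasterized from the
    vector stream layer; flagged cells get the exaggerated drop."""
    return [
        [e - drop if s else e for e, s in zip(erow, srow)]
        for erow, srow in zip(elev, streams)
    ]

elev    = [[890, 892, 890]]   # road crossing at 892 ft, culvert beneath it
streams = [[0, 1, 0]]         # the stream passes through the middle cell
burned  = burn_streams(elev, streams)  # [[890, -108, 890]]
```

The resulting elevations are physically meaningless at the burned cells, which is fine: only the relative drops matter to the flow-direction calculation.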
Next we tell the extension a threshold for how many watersheds we want; in this case, 500.
We can up that to 5,000, which uses the same model to define more watersheds (in effect, sub-watersheds). Ironically, it takes less time to create more, so it's easier to start with a big number and scale back.
Next we matched the watersheds to their administrative watersheds (remember these?).
It starts to take shape: administrative and computer-generated watersheds.
Sub-watersheds: they recognize some legally, but not all of the 5,000.
And more…
And more…
Yep, more…
This shows all watersheds in the county. Inside each watershed you can create one or more "projects" that you may assess for.
Opening a project, you can define how it will be assessed. In this case, it was a reconstruction of a ditch. The allocation mode chosen was "proportion of total": it figures out the number of benefited acres for each property owner in the watershed and what proportion of the total watershed acres that is, then calculates the corresponding percentage of the project cost ($150,000). Another option is to assess based on a rate per acre ($3/acre). The last option is rate per point, which allows you to ramp the cost up and down based on criteria. All are based originally on benefited acres.
Each parcel in the watershed is represented. Data is pulled from various sources (CAMA, GIS, etc.). "Watershed acres" is "benefited acres." "$ Total" is what that landowner will be assessed. "$/Acre" goes back to the rate-per-acre method. The totals are 13,169+ acres and $150,000 (see earlier slide).
Influence the assessment using "assessment methods," the combination of four parts: transform method, operation, target, and aggregator.
The transform method determines how the raw input value will be turned into a point value. Options include:
- look up table: assign a weight in points to some or all of the values in a data source field
- fixed value: designates a value that will overrule any other factors
- identity: the value in the data source is literally the point value
- linear: applies a common assessing formula y = (M * x) + B, where y is the output, x is the input, M is the scale, and B is the offset
- inverse: applies an inverse assessing formula y = (M / x) + B
The operation setting establishes what will be done with the point value once it has been determined. Options include:
- sum: the output of the transform method is added to the running total of points
- factor: the output of the transform method is multiplied by the running total of points
- minimum: the running total of points can be no less than the output of the transform method
The target setting determines whether the output of the transform method and operation is applied to the running point total (score) or the actual dollar value (dollars) of the assessment.
The aggregator setting determines how the rule will handle cases where multiple data values exist for a single parcel (e.g., multiple soil types with different values are covered by one parcel). Options include:
- none: one value is expected, no aggregation needed
- average: computes the straight average of all the scores
- area weighted average: gives more weight to the point scores of larger areas
- sum: adds all point values together
- sum of acreages: adds up the acreages and uses that sum as the point total
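How the transform and operation parts compose can be sketched as below, using the linear transform y = (M * x) + B with the "sum," "factor," and "minimum" operations on a running point total. The function names, rule tuple format, and parcel values are assumptions for illustration, not the product's actual API.

```python
def linear(x, M, B):
    """The common assessing formula: scale M, offset B."""
    return (M * x) + B

def inverse(x, M, B):
    """The inverse assessing formula."""
    return (M / x) + B

def apply_rules(raw_values, rules):
    """raw_values maps field -> input value; each rule is a tuple
    (field, transform, M, B, operation). Returns the running score."""
    score = 0.0
    for field, transform, M, B, operation in rules:
        points = transform(raw_values[field], M, B)
        if operation == "sum":          # add to the running total
            score += points
        elif operation == "factor":     # multiply the running total
            score *= points
        elif operation == "minimum":    # floor the running total
            score = max(score, points)
    return score

# Hypothetical parcel: 25 benefited acres and a soil weight of 1.5.
rules = [
    ("acres", linear, 2.0, 0.0, "sum"),     # 2 points per benefited acre
    ("soil",  linear, 1.0, 0.0, "factor"),  # then scale by the soil weight
]
score = apply_rules({"acres": 25.0, "soil": 1.5}, rules)  # 50 * 1.5 = 75.0
```

The target setting would decide whether that 75.0 lands on the point score or directly on the dollar amount, and an aggregator would first collapse multi-value fields (e.g., several soil types on one parcel) into the single input value used here.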
Apply them here; use one or use many. Different weights can be established.
After the assessment is calculated, you can utilize Crystal Reports to create reports.