2. URISA GIS Capability Maturity Model
The GIS Capability Maturity Model (GISCMM) is a
comprehensive assessment of GIS management
capability maturity.
The assessment measures your organization's GIS
maturity across a wide range of geospatial capabilities:
People, Data, Processes, Policy, Technology & Legal.
The Urban and Regional Information Systems Association
(URISA) has been at the forefront of government and
business applications of GIS for almost 50 years.
The GISCMM was developed in 2010.
3. How the GISCMM Works
The GISCMM is divided into two main areas called
Enabling Capability (EC) and Execution Ability (EA).
(EC) - The technology, data, resources, and related
infrastructure that can be bought, developed, or
acquired to support enterprise GIS operations.
(EA) - The degree to which an enterprise achieves faster,
better, cheaper results through clarity in planning.
Ensures that the investment decisions and project
deliverables are connected and monitored for
sustainable business results.
4. Methodology
Stakeholders completed a self-assessment survey.
ITS compiled all completed surveys into a combined
result.
The combined result was reviewed, analyzed, and
evaluated.
Survey results were posted online at URISA's GIS
Management Institute site.
5. Participants Surveyed
Engaged Stakeholders from:
Building and Development Services
Community Services
Information Technology Services
Neighborhood Services
Property Appraiser’s Office
Property Management
Public Works
Public Safety
Redevelopment and Economic Opportunity
Utilities
8. Analysis Performed
Standard Deviation – a quantity indicating how widely
a group of values is spread around its average.
A low standard deviation indicates little variation in
survey responses while a high standard deviation
indicates a wider variance in survey responses.
Average – a number expressing the central or
typical value in a data set.
A low average indicates poor performance in
a specific metric while a higher average
indicates better performance.
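As a sketch, the two statistics can be computed per metric with Python's statistics module. The responses below are illustrative, not actual survey data, and the survey is not specific about sample vs. population standard deviation, so population is assumed here:

```python
from statistics import mean, pstdev

# Illustrative 1-5 maturity scores from nine respondents for one metric
responses = [3, 4, 4, 3, 5, 4, 3, 4, 4]

avg = mean(responses)        # central / typical value of the responses
spread = pstdev(responses)   # population standard deviation

print(f"average = {avg:.2f}, std. dev. = {spread:.2f}")
```

A low `spread` means respondents largely agreed; a high `spread` means their answers varied widely.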
9. Performance Matrix
Using the standard deviation and average, we developed
a performance matrix.
Std. Dev. | Average | Result                              | Benefit / Resource
Low       | Low     | Everyone doing poorly               | Can only improve / no local knowledge (ESRI Credits)
High      | Low     | Most doing poorly, a few doing well | Lots of benefit / some local help available
High      | High    | Most doing well, a few doing poorly | Some benefit / lots of local help available
Low       | High    | Everyone is doing well              | None / desired maturity
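A minimal sketch of the matrix as code; the function and its quadrant labels are paraphrased from the table above for illustration, not part of the GISCMM itself:

```python
def classify(std_is_high: bool, avg_is_high: bool) -> str:
    """Map a metric's Low/High std. dev. and average to its matrix quadrant."""
    if not std_is_high and not avg_is_high:
        return "Everyone doing poorly - can only improve, no local knowledge"
    if std_is_high and not avg_is_high:
        return "Most doing poorly, a few doing well - lots of benefit, some local help"
    if std_is_high and avg_is_high:
        return "Most doing well, a few doing poorly - some benefit, lots of local help"
    return "Everyone doing well - desired maturity"

print(classify(std_is_high=False, avg_is_high=True))
```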
10. (EC) Analysis Performed
Calculated the average standard deviation to be
0.253 and used that to delineate between a low
and high std. dev.
Calculated the average of the averages to be 0.79
and used that to delineate between a low and high
average.
Applied Low / High ratings to the various (EC)
metrics based on our performance matrix.
11. (EA) Analysis Performed
Calculated the average standard deviation to be
1.21 and used that to delineate between a low and
high std. dev.
Calculated the average of the averages to be 3.25
and used that to delineate between a low and high
average.
Applied Low / High ratings to the various (EA)
metrics based on our performance matrix.
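As a sketch of the delineation step, using the EC cutoffs reported above (0.253 for std. dev., 0.79 for average); the per-metric values and names below are hypothetical, not actual survey results:

```python
STD_CUTOFF = 0.253  # average of the EC metric standard deviations
AVG_CUTOFF = 0.79   # average of the EC metric averages

# Hypothetical (std. dev., average) pairs for two made-up metrics
metrics = {
    "EC-a": (0.10, 0.95),
    "EC-b": (0.40, 0.50),
}

# Rate each metric Low/High on both dimensions relative to the cutoffs
ratings = {
    name: ("High" if std > STD_CUTOFF else "Low",
           "High" if avg > AVG_CUTOFF else "Low")
    for name, (std, avg) in metrics.items()
}

for name, (std_rating, avg_rating) in ratings.items():
    print(f"{name}: std. dev. {std_rating}, average {avg_rating}")
```

The resulting Low/High pair for each metric is then looked up in the performance matrix.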
14. Enabling Capability (EC) Results
There were issues in respondents' interpretation of
question groups EC1 – EC4, which dealt with
specific data layers.
Focused on EC5 – EC23
15. Negative (EC) Results
Bad
EC5 – GIS Data Coordination
EC6 – Metadata
EC20 – GIS Linked to Strategic Goals
EC22 – GIS Funding
EC23 – GIS Financial Plan
Very Bad
EC19 – GIS Governance Structure
16. Positive (EC) Results
Good
EC15 – GIS Application Portfolio O&M
EC21 – GIS Budget
Very Good
EC7 – Spatial Data Warehouse
EC8 – Architecture Design
EC9 – Technical Infrastructure
EC10 – Replacement Plan
EC11 – GIS Software Maintenance
EC12 – Data backup and security
EC13 – GIS Application Portfolio
EC14 – GIS Application Portfolio Management
EC16 – Professional GIS Management
EC17 – Professional GIS Operations Staff
EC18 – GIS Staff Training and Professional Development
18. Execution Ability (EA) Results
There was more opportunity for growth on this side
of the GISCMM
EC was stronger due in large measure to the
maturity of supporting IT processes like:
Tech refresh
Data backups and recovery
Robust networking
Virtualization
Network-Attached Storage
19. Negative (EA) Results
Bad
EA1 – New Client Services Evaluation and Development
EA3 – Service Delivery Tracking and Oversight
EA4 – Service Quality Assurance
EA5 – Application Development or Procurement
EA7 - Quality Assurance and Quality Control
EA13 – Operation Performance Management
EA16 – Resource Allocation Management
Very Bad
EA2 – User Support; Help Desk and End-User Training
EA11 – Regional Collaboration
EA12 – Staff Development
EA13 – Client Satisfaction Monitoring and Assurance
20. Positive (EA) Results
Good
EA9 – Process Event Management
EA14 – Individual GIS Staff Performance Management
Very Good
EA6 – Project Management Methodology
EA8 – GIS System Management
EA10 – Contract and Supplier Management
EA17 – GIS Data Sharing
EA18 – GIS Software License Sharing
EA19 – GIS Data Inter-operability
EA20 – Legal and Policy Affairs Management
EA21 – Balancing minimum privacy with max data usage
EA22 – Service to the community and profession
22. Summary Conclusions
This GISCMM effort and the resulting analysis confirm
what we felt we knew anecdotally and provide the data
to back it up.
Major Pain Points
Lack of overall governance (EC)
Lack of Data Coordination (EC)
Lack of User Support; Service Desk and End User Training (EA)
Service Delivery Tracking and Quality Assurance (EA)
Staff Development (EA)
Client Satisfaction Monitoring (EA)
23. Summary Conclusions Cont.
Our Enabling Capability (EC) is stronger than our
Execution Ability (EA). In other words, we have more
capability than we can effectively use.
Distributed GIS has inherent disadvantages regarding
(EA). GIS being federated within different organizations
creates inconsistencies in how work is tracked, projects
managed, resources allocated etc.
The lack of a governing body creates a lack of general
direction for the enterprise.
GIS provides good service despite these challenges,
which I believe is a testament to the
professionalism of GIS staff in all quarters of the
enterprise.
24. Summary Conclusions Cont.
Departments look to ITS to fill gaps: maintaining
data that is not specifically "owned" by another
department, providing technical GIS leadership or
assistance when needed, and plugging funding gaps
by passing the hat as needed.
This first run of the GISCMM will help us better
understand the questions and goals, improving
result consistency on future GISCMM surveys.
25. Recommendations
Conduct customer satisfaction surveys regularly.
Brainstorm other ways to monitor customer
satisfaction.
Coordinate some form of in-house training.
Consider adding Learning and Service Credits to the
next ESRI ELA.
Engage with ESRI to develop centralized governance of
Portal, AGOL, and the overall Enterprise.
Consider replacing hardware used by GIS personnel
throughout the county all at once rather than on a
more frequent department-by-department rotation.
26. Recommendations Cont.
Work with the CSC to try to create greater GIS issue
resolution capability within the CSC.
Consider consolidating GIS resources under
centralized governance with an eye towards
alignment with strategic business goals.
28. Sad Footnote - 8/23/19
I did want to announce to you the end of the GIS Management Institute. One of
the most important tasks of the Board of Directors is to constantly evaluate all of
URISA’s programs to ensure we are providing the best opportunities for our
members to reach their top potential. When the GIS Management Institute (GMI)
was created, the original intent was to provide a comprehensive program that
would facilitate and support GIS managers with the online GIS Capability Maturity
Model and several more initiatives to help organizations with their GIS
management operations. These include:
Development of GIS Management Best Practices and Body of Knowledge
GIS Management-related professional education (GIS Leadership Academy and a robust
GIS Leadership & Management program track at GIS-Pro each year)
Expansion of GIS Management information dissemination
Promotion of awareness and adoption of GIS management best practices by organizations
and GIS professionals
Sincerely,
Kim McDonough, GISP, Tennessee Department of Transportation
URISA President
29. The End!
Questions?
Special Thanks to:
John Sharp – Excel Jedi
Jeff Pace – Stats Man
Sharon Wallace – Respondent Wrangler
Bryan Townsend – GISCMM Coach