Is your organization committing some of the most common data collection mistakes? Uncover asset data collection basics while highlighting common missteps municipalities make along the way.
Growing a garden takes a lot of effort. Only after a lot of work do you see the fruits of your labor. A database is much the same, requiring constant upkeep and maintenance to ensure you have useful data to “harvest.” Presented at TSAE Tech Talk 2016.
Audit: Breaking Down Barriers to Increase the Use of Data Analytics (CaseWare IDEA)
The document discusses breaking down barriers to increase the use of data analytics in auditing. It outlines how NASDAQ uses data analytics in its internal auditing processes. It identifies common barriers like lack of skills and technology experience. It recommends starting small with basic analytics and focusing on business objectives. The document provides examples of using analytics for email logs, access reviews, alerts and other auditing purposes. It emphasizes gaining management support and measuring effectiveness to expand analytics use.
The document discusses data collection in computer science. It describes how data collection involves gathering data from various sources and storing it centrally. Data collection can be done manually or automatically from sources like websites, databases and sensors. The goal of data collection is to obtain accurate information to facilitate analysis. Key steps in data collection include identifying research issues, data requirements, data sets, and collection methods. The document discusses primary and secondary data collection methods and their advantages and disadvantages. Common challenges in data collection include data quality issues, finding relevant data, deciding what to collect, and dealing with big data.
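The central idea in that summary, gathering records from several sources into one central store, can be sketched in a few lines of plain Python. The source names and schema below are invented for illustration; a fake CSV export stands in for a file-based source and a small list stands in for a sensor feed:

```python
import csv
import io

def collect(sources):
    """Pull records from heterogeneous sources into one central list,
    tagging each record with its origin for later quality checks."""
    central = []
    for name, reader in sources.items():
        for record in reader():
            record["source"] = name
            central.append(record)
    return central

# Two illustrative sources: an in-memory CSV "export" and a fake sensor feed.
csv_data = "id,value\n1,10\n2,20\n"
sources = {
    "csv_export": lambda: list(csv.DictReader(io.StringIO(csv_data))),
    "sensor": lambda: [{"id": "s1", "value": 7}],
}
data = collect(sources)
print(len(data))  # 3 records, each tagged with its source
```

Tagging each record with its origin keeps the provenance information needed when data quality issues surface later.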
This document discusses how data and machine learning systems work, and some of their limitations. It makes three key points:
1. Machine learning systems are only as good as the data used to train them, and all data has some inherent bias which can negatively impact results if not addressed.
2. While large datasets and machine learning are powerful, humans still need to provide oversight to catch errors, prevent harm, and ensure systems don't behave in unexpected ways.
3. Thorough testing of systems with diverse datasets is needed to identify and address biases, anticipate problems, and ensure models are robust and represent their intended domains.
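The third point, testing models against diverse data slices, can be sketched in plain Python. The groups, labels, and predictions below are illustrative, not taken from the document:

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Compute per-group accuracy to surface slices where a model
    underperforms. Each record is (group, true_label, predicted_label)."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth == pred:
            hits[group] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Illustrative predictions: the model does worse on group "B",
# a gap that a single aggregate accuracy number would hide.
records = [
    ("A", 1, 1), ("A", 0, 0), ("A", 1, 1), ("A", 0, 1),
    ("B", 1, 0), ("B", 0, 1), ("B", 1, 1), ("B", 0, 0),
]
print(accuracy_by_group(records))  # {'A': 0.75, 'B': 0.5}
```

Breaking evaluation down by group is one simple way to catch the biases the summary warns about before a model ships.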
Digital information within your organization is doubling in size at least every 24 months. Understanding what types of information you have, where it is stored, and how to determine its importance to your organization is the first step. Join our webinar on "5 Steps to Curbing Information Sprawl" to hear about the steps you can take to help minimize information sprawl across your organization.
Focus areas will include, but are not limited to: email, file shares, SharePoint, and line-of-business systems and applications.
This chapter discusses techniques for discovering requirements for new systems. It defines requirements and differentiates between functional and non-functional requirements. It then describes several fact-finding techniques analysts can use to identify requirements, including sampling documentation, observation, interviews, questionnaires, prototyping, and joint requirements planning sessions. It emphasizes the importance of documenting requirements and managing changes to requirements over the system development lifecycle.
Audit Webinar: Surefire Ways to Succeed with Data Analytics (CaseWare IDEA)
While the majority of executives and internal audit leaders agree that data analytics is important, according to the 2016 IIA CBOK study, only 40% of respondents are using technology in audit methodology. Why the disconnect?
In this webinar, we identify some of the common challenges associated with starting and continuing to use data analytics in your audit process. Easy-to-implement methods that help expand the use of data analytics and improve your audit coverage are also presented.
SLIDESHARE: www.slideshare.net/CaseWare_Analytics
WEBSITE: www.casewareanalytics.com
BLOG: www.casewareanalytics.com/blog
TWITTER: www.twitter.com/CW_Analytic
Learning objectives
• Discuss ways to increase and expand the use of data analytics, including business and technology applications
• Identify the skills needed for successful use of data analytics
• Provide guidance on obtaining internal management support
• Offer tips on how to measure staff utilization and the effectiveness of analytics during audits
For information on our Webinars visit AuditNet.org (www.auditnet.org)
This document provides an agenda for a presentation on best practices for data collection and use. It discusses collecting the right types of data, maintaining data quality, using data for wealth screenings and segmentation, and addressing obstacles like cost, return on investment, and data management challenges. Case studies are presented on how different organizations have leveraged data collection and analysis. The importance of big data and using available online data is also covered.
NTEN Webinar - Data Cleaning and Visualization Tools for Nonprofits (Azavea)
Slides from a webinar we conducted for NTEN that covers tools that nonprofits can use to clean and prepare their datasets and then visualize them via charts, maps, and graphs.
Bringing agility to big data applications can be challenging for several reasons:
1) There is a very long feedback cycle when developing workable software applications, due to the time spent waiting for quality data.
2) Switching costs between tasks and projects are high for big data applications.
3) Releasing features to all users can result in more questions about data quality issues.
4) Successfully developing workable big data applications has a very low success rate and wastes many resources.
Panorama Necto uncovers the hidden insights in your data, presents them in beautiful dashboards powered by KPI alerts, and is managed through a secure, centralized, state-of-the-art business intelligence platform.
Part of OFX Academy Course: Improving Line Performance
http://academy.optimumfx.com/course/improving-line-performance/
Improving Packaging Line Performance – Using the Correct Data and Drill-Down Analysis
Your AI and ML Projects Are Failing – Key Steps to Get Them Back on Track (Precisely)
With recent studies indicating that 80% of AI and machine learning projects fail due to data quality issues, it's critical to think about the problem holistically. This is not a simple topic: data quality issues can arise at any point, from project kickoff through model implementation and usage.
View this webinar on-demand, where we start with four foundational data steps to get our AI and ML projects grounded and underway, specifically:
• Framing the business problem
• Identifying the “right” data to collect and work with
• Establishing baselines of data quality through data profiling and business rules
• Assessing fitness for purpose for training and evaluating the subsequent models and algorithms
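The "baselines of data quality through business rules" step above can be sketched as a handful of checks run over incoming records. The field names and thresholds here are invented for illustration, not taken from the webinar:

```python
def check_record(rec):
    """Apply simple business rules to one record; return a list of violations."""
    violations = []
    if not rec.get("customer_id"):
        violations.append("missing customer_id")
    if not (0 <= rec.get("age", -1) <= 120):
        violations.append("age out of range")
    if rec.get("amount", 0) < 0:
        violations.append("negative amount")
    return violations

def quality_baseline(records):
    """Fraction of records passing all rules: a simple fitness-for-purpose score."""
    passing = sum(1 for r in records if not check_record(r))
    return passing / len(records)

records = [
    {"customer_id": "C1", "age": 34, "amount": 120.0},
    {"customer_id": "",   "age": 34, "amount": 10.0},   # missing id
    {"customer_id": "C3", "age": 999, "amount": 5.0},   # impossible age
]
print(quality_baseline(records))  # one of three records passes
```

Tracking this score over time gives a baseline against which later data loads, and the models trained on them, can be judged.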
You’re on your way to becoming a high-performance government. You’ve worked hard to accurately collect your assets and track your work—but now what? It’s time to take your Cartegraph data to the next level. See how your peers are using their data to answer questions, become a more productive organization, and prepare for the future.
This document provides guidance on selecting and constructing data collection instruments. It discusses different data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. Structured, semi-structured and unstructured methods are described for various tools like surveys, interviews, observation and participatory methods. Guidelines are provided for choosing the right approach based on evaluation needs and deciding how to record and analyze collected data.
This document provides guidance on selecting and constructing data collection instruments. It discusses different data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. Structured, semi-structured and unstructured methods are described for collecting data through tools like surveys, interviews, observation, documents and participatory methods. Guidelines are provided for choosing the appropriate data collection methods depending on the evaluation needs and context.
This document provides guidance on selecting and constructing data collection instruments for evaluations. It discusses various data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. A number of specific tools for collecting data are described, including structured and semi-structured surveys, interviews, focus groups, observation, diaries/journals, expert judgment, and the Delphi technique. Guidelines are provided for implementing each tool, along with their advantages and challenges. The goal is to help evaluators select the most appropriate data collection methods based on the information needs, resources, and context of the evaluation.
This document provides guidance on selecting and constructing data collection instruments for evaluations. It discusses various data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. A number of specific tools for collecting data are also described, including structured and semi-structured surveys, interviews, focus groups, observation, diaries/journals, expert judgment, and the Delphi technique. Guidelines are provided for effectively implementing each of these tools.
This document provides guidance on selecting and constructing data collection instruments. It discusses different data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. Structured, semi-structured and unstructured methods are described for collecting data through tools like surveys, interviews, observation, documents and participatory methods. Guidelines are provided for choosing the right approach based on the evaluation needs and ensuring quality in data collection.
The Evolution of Laboratory Data Systems: Replacing Paper, Streamlining Proce... (IDBS)
The laboratory has become an increasingly electronic environment. It’s not just that the volume of data is greater than ever before, it’s also being generated at ever-increasing speeds. As companies move towards a fully integrated lab environment there are benefits and pitfalls along the way. Successful projects start with a solid foundation, and keep a clear vision in mind.
Sqrrl's Director of Product Marketing, Joe Travaglini, shares some lessons learned about how to approach a "Big Data problem" with his 10 steps to building a Big App, and how to mobilize data-driven thinking into your line of business.
Foundational Strategies for Trust in Big Data Part 2: Understanding Your Data (Precisely)
This document provides an overview of understanding data through data profiling. It discusses:
1. The five key steps to effective data profiling: defining how to analyze the data, what to review, what to look for, when to build rules, and what to communicate.
2. Common challenges with big data and new data types, and measurements for assessing data quality.
3. A case study of how British Airways leveraged data profiling and governance to ensure accurate customer data across multiple systems and improve analysis, marketing and service.
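At their simplest, the profiling steps described above (deciding what to review and what to look for) reduce to computing per-column statistics. A minimal sketch in plain Python, with illustrative column names:

```python
def profile_column(values):
    """Basic profile of one column: counts that flag completeness
    and uniqueness issues."""
    non_null = [v for v in values if v not in (None, "")]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

rows = [
    {"email": "a@example.com", "country": "UK"},
    {"email": "",              "country": "UK"},
    {"email": "b@example.com", "country": None},
]
profile = {col: profile_column([r[col] for r in rows]) for col in rows[0]}
print(profile["email"])    # {'count': 3, 'nulls': 1, 'distinct': 2}
print(profile["country"])  # {'count': 3, 'nulls': 1, 'distinct': 1}
```

Null counts and distinct counts are the starting point for the data quality measurements the deck describes; production profiling tools add type inference, pattern analysis, and value distributions on top.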
Curiosity Software and RCG Global Services Present - Solving Test Data: the g... (Curiosity Software Ireland)
This webinar was co-hosted by Curiosity and RCG Global Services on January 20th, 2022. Watch the webinar on demand: https://www.curiositysoftware.ie/solving-test-data-webinar
Outdated test data management practices are today a sinkhole for testing and development time. They stifle release velocity, risk costly legislative non-compliance, and yet still do not provide the data needed to protect releases from damaging bugs. To achieve true quality at speed, the test data paradigm must shift. Enterprises must move beyond slowly copying large sets of production data to a limited number of out-of-date test environments.
In this webinar, Global Head of Quality Engineering at RCG, Niko Mangahas, draws on extensive project experience to define the test data challenges facing enterprises today. He then helps you identify the right test data solution for your organisation, setting out principles for effective requirements gathering and program design. Niko then hands over to veteran test data inventor, Huw Price, who demoes some of the latest techniques for making complete and compliant data available on-the-fly during parallel testing, development, and CI/CD.
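One technique in this space, generating synthetic rather than copied production data, can be sketched as below. The schema is invented for illustration and this is not RCG's or Curiosity's implementation:

```python
import random
import string

def synth_row(rng):
    """Generate one synthetic, compliance-safe customer row:
    no production data is copied, so nothing needs masking."""
    name = "".join(rng.choices(string.ascii_uppercase, k=6))
    return {
        "customer": name,
        "balance": round(rng.uniform(0, 10_000), 2),
        "active": rng.random() < 0.8,
    }

rng = random.Random(42)  # seeded so parallel test runs are repeatable
rows = [synth_row(rng) for _ in range(5)]
print(len(rows))  # 5 freshly generated rows
```

Because rows are generated on demand from a seed, each test environment can get complete, compliant data without waiting for a production copy.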
DATA SCIENCE AND BIG DATA ANALYTICSCHAPTER 2 DATA ANA.docx (randyburney60861)
DATA SCIENCE AND BIG DATA ANALYTICS
CHAPTER 2: DATA ANALYTICS LIFECYCLE
DATA ANALYTICS LIFECYCLE
• Data science projects differ from BI projects
• More exploratory in nature
• Critical to have a project process
• Participants should be thorough and rigorous
• Break large projects into smaller pieces
• Spend time to plan and scope the work
• Documenting adds rigor and credibility
DATA ANALYTICS LIFECYCLE
• Data Analytics Lifecycle Overview
• Phase 1: Discovery
• Phase 2: Data Preparation
• Phase 3: Model Planning
• Phase 4: Model Building
• Phase 5: Communicate Results
• Phase 6: Operationalize
• Case Study: GINA
2.1 DATA ANALYTICS LIFECYCLE OVERVIEW
• The data analytics lifecycle is designed for Big Data problems and data science projects
• With six phases, project work can occur in several phases simultaneously
• The cycle is iterative to portray a real project
• Work can return to earlier phases as new information is uncovered
2.1.1 KEY ROLES FOR A SUCCESSFUL ANALYTICS PROJECT
• Business User – understands the domain area
• Project Sponsor – provides requirements
• Project Manager – ensures meeting objectives
• Business Intelligence Analyst – provides business domain expertise based on deep understanding of the data
• Database Administrator (DBA) – creates DB environment
• Data Engineer – provides technical skills, assists data management and extraction, supports analytic sandbox
• Data Scientist – provides analytic techniques and modeling
2.1.2 BACKGROUND AND OVERVIEW OF DATA ANALYTICS LIFECYCLE
• Data Analytics Lifecycle defines the analytics process and best practices from discovery to project completion
• The Lifecycle employs aspects of
• Scientific method
• Cross Industry Standard Process for Data Mining (CRISP-DM)
• Process model for data mining
• Davenport’s DELTA framework
• Hubbard’s Applied Information Economics (AIE) approach
• MAD Skills: New Analysis Practices for Big Data by Cohen et al.
https://en.wikipedia.org/wiki/Scientific_method
https://en.wikipedia.org/wiki/Cross_Industry_Standard_Process_for_Data_Mining
http://www.informationweek.com/software/information-management/analytics-at-work-qanda-with-tom-davenport/d/d-id/1085869?
https://en.wikipedia.org/wiki/Applied_information_economics
https://pafnuty.wordpress.com/2013/03/15/reading-log-mad-skills-new-analysis-practices-for-big-data-cohen/
OVERVIEW OF DATA ANALYTICS LIFECYCLE
2.2 PHASE 1: DISCOVERY
1. Learning the Business Domain
2. Resources
3. Framing the Problem
4. Identifying Key Stakeholders
5. Interviewing the Analytics Sponsor
6. Developing Initial Hypotheses
7. Identifying Potential Data Sources
2.3 PHASE 2: DATA PREPARATION
• Includes steps to explore, preprocess, and condition data
• Create robust environment – analytics sandbox
• Data preparation tends to be the most labor-intensive phase of the lifecycle
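Phase 2's explore/preprocess/condition steps might look like this in miniature. The raw records and cleaning choices below are illustrative, not from the chapter:

```python
def condition(records):
    """Minimal data-conditioning pass: drop incomplete rows,
    normalize units and casing."""
    cleaned = []
    for rec in records:
        if rec.get("temp_f") is None or not rec.get("city"):
            continue  # drop rows missing required fields
        cleaned.append({
            "city": rec["city"].strip().title(),               # normalize casing
            "temp_c": round((rec["temp_f"] - 32) * 5 / 9, 1),  # convert units
        })
    return cleaned

raw = [
    {"city": " austin ", "temp_f": 95},
    {"city": "",         "temp_f": 70},    # missing city: dropped
    {"city": "Boston",   "temp_f": None},  # missing reading: dropped
    {"city": "denver",   "temp_f": 32},
]
print(condition(raw))  # [{'city': 'Austin', 'temp_c': 35.0}, {'city': 'Denver', 'temp_c': 0.0}]
```

In a real project this kind of conditioning would run inside the analytics sandbox mentioned above, so experiments never mutate the source systems.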
Outsmarting Traffic: Integrating Waze Into Your Maps and Apps (Cartegraph)
Your commuters are frustrated: after all, they’re wasting 41 hours a year in traffic on average. It’s time to help drivers outsmart traffic with real-time, crowd-sourced information. Get an inside look at the City of Colorado Springs Cone Zone Program and see how they’re integrating ArcGIS web maps, asset management data, and Waze to streamline their operations and provide drivers with mobile traffic and navigation alerts.
Check out this PPT and learn how Andy's team is:
- Saving 286 hours, or $11K, in staff time with real-time work and asset data
- Rerouting drivers around closures, congestion, and special events
- Publicly sharing the location of new and ongoing construction projects
You’ve talked the high-performance talk, now it’s time to walk the high-performance walk. Addressing nearly 100 votes, the Cartegraph Winter 2019 release is here to help your team monitor key performance indicators, dig deeper into dashboard data, track work more efficiently with labor crews, and better justify your maintenance plans to council.
Watch this webinar to learn more about:
- Evaluating your progress with a boatload of new default dashboard gadgets
- Using Labor Crews to quickly assign tasks to a team and split the resources
- Visualizing Scenario Builder results on a map by plan year and activity
- Searching Campus content without leaving your Cartegraph browser
This document provides an agenda for a presentation on best practices for data collection and use. It discusses collecting the right types of data, maintaining data quality, using data for wealth screenings and segmentation, and addressing obstacles like cost, return on investment, and data management challenges. Case studies are presented on how different organizations have leveraged data collection and analysis. The importance of big data and using available online data is also covered.
NTEN Webinar - Data Cleaning and Visualization Tools for NonprofitsAzavea
Slides from a webinar we conducted for NTEN that covers tools that nonprofits can use to clean and prepare their datasets and then visualize them via charts, maps, and graphs.
Bringing agility to big data applications can be challenging for several reasons:
1) There is a very long feedback cycle when developing workable software applications due to the time needed to wait for quality data.
2) Switching costs between tasks and projects are high for big data applications.
3) Releasing features to all users can result in more questions about data quality issues.
4) Successfully developing workable big data applications has a very low success rate and wastes many resources.
Panorama Necto uncovers the hidden insights in your data and presents them in beautiful dashboards powered with KPI Alerts, and is managed by the most secure, centralized & state of the art Business Intelligence.
Part of OFX Academy Course: Improving Line Performance
http://academy.optimumfx.com/course/improving-line-performance/
Improving Packaging Line Performance –Using the correct Data and Drill Down Analysis
Your AI and ML Projects Are Failing – Key Steps to Get Them Back on TrackPrecisely
With recent studies indicating that 80% of AI and machine learning projects are failing due to data quality related issues, it’s critical to think holistically about this fact. This is not a simple topic – issues in data quality can occur throughout from starting the project through to model implementation and usage.
View this webinar on-demand, where we start with four foundational data steps to get our AI and ML projects grounded and underway, specifically:
• Framing the business problem
• Identifying the “right” data to collect and work with
• Establishing baselines of data quality through data profiling and business rules
• Assessing fitness for purpose for training and evaluating the subsequent models and algorithms
You’re on your way to becoming a high-performance government. You’ve worked hard to accurately collect your assets and track your work—but now what? It’s time to take your Cartegraph data to the next level. See how your peers are using their data to answer questions, become a more productive organization, and prepare for the future.
This document provides guidance on selecting and constructing data collection instruments. It discusses different data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. Structured, semi-structured and unstructured methods are described for various tools like surveys, interviews, observation and participatory methods. Guidelines are provided for choosing the right approach based on evaluation needs and deciding how to record and analyze collected data.
This document provides guidance on selecting and constructing data collection instruments. It discusses different data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. Structured, semi-structured and unstructured methods are described for collecting data through tools like surveys, interviews, observation, documents and participatory methods. Guidelines are provided for choosing the appropriate data collection methods depending on the evaluation needs and context.
This document provides guidance on selecting and constructing data collection instruments for evaluations. It discusses various data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. A number of specific tools for collecting data are described, including structured and semi-structured surveys, interviews, focus groups, observation, diaries/journals, expert judgment, and the Delphi technique. Guidelines are provided for implementing each tool, along with their advantages and challenges. The goal is to help evaluators select the most appropriate data collection methods based on the information needs, resources, and context of the evaluation.
This document provides guidance on selecting and constructing data collection instruments for evaluations. It discusses various data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. A number of specific tools for collecting data are also described, including structured and semi-structured surveys, interviews, focus groups, observation, diaries/journals, expert judgment, and the Delphi technique. Guidelines are provided for effectively implementing each of these tools.
This document provides guidance on selecting and constructing data collection instruments. It discusses different data collection strategies and characteristics of good measures. Both quantitative and qualitative approaches are covered. Structured, semi-structured and unstructured methods are described for collecting data through tools like surveys, interviews, observation, documents and participatory methods. Guidelines are provided for choosing the right approach based on the evaluation needs and ensuring quality in data collection.
The Evolution of Laboratory Data Systems: Replacing Paper, Streamlining Proce...IDBS
The laboratory has become an increasingly electronic environment. It’s not just that the volume of data is greater than ever before, it’s also being generated at ever-increasing speeds. As companies move towards a fully integrated lab environment there are benefits and pitfalls along the way. Successful projects start with a solid foundation, and keep a clear vision in mind.
Sqrrl's Director of Product Marketing, Joe Travaglini, shares some lessons learned about how to approach a "Big Data problem" with his 10 steps to building a Big App, and how to mobilize data-driven thinking into your line of business.
Foundational Strategies for Trust in Big Data Part 2: Understanding Your DataPrecisely
This document provides an overview of understanding data through data profiling. It discusses:
1. The five key steps to effective data profiling: defining how to analyze the data, what to review, what to look for, when to build rules, and what to communicate.
2. Common challenges with big data and new data types, and measurements for assessing data quality.
3. A case study of how British Airways leveraged data profiling and governance to ensure accurate customer data across multiple systems and improve analysis, marketing and service.
Curiosity Software and RCG Global Services Present - Solving Test Data: the g...Curiosity Software Ireland
This webinar was co-hosted by Curiosity and RCG Global Services on January 20th, 2022. Watch the webinar on demand: https://www.curiositysoftware.ie/solving-test-data-webinar
Outdated test data management practices are today a sinkhole for testing and development time. They stifle release velocity, risk costly legislative non-compliance, and yet still do not provide the data needed to protect releases from damaging bugs. To achieve true quality at speed, the test data paradigm must shift. Enterprises must move beyond slowly copying large sets of production data to a limited number of out-of-date test environments.
In this webinar, Global Head of Quality Engineering at RCG, Niko Mangahas, draws on extensive project experience to define the test data challenges facing enterprises today. He then helps you identify the right test data solution for your organisation, setting out principles for effective requirements gathering and program design. Niko then hands over to veteran test data inventor, Huw Price, who demoes some of the latest techniques for making complete and compliant data available on-the-fly during parallel testing, development, and CI/CD.
DATA SCIENCE AND BIG DATA ANALYTICSCHAPTER 2 DATA ANA.docxrandyburney60861
DATA SCIENCE AND BIG DATA
ANALYTICS
CHAPTER 2:
DATA ANALYTICS LIFECYCLE
DATA ANALYTICS LIFECYCLE
• Data science projects differ from BI projects
• More exploratory in nature
• Critical to have a project process
• Participants should be thorough and rigorous
• Break large projects into smaller pieces
• Spend time to plan and scope the work
• Documenting adds rigor and credibility
DATA ANALYTICS LIFECYCLE
• Data Analytics Lifecycle Overview
• Phase 1: Discovery
• Phase 2: Data Preparation
• Phase 3: Model Planning
• Phase 4: Model Building
• Phase 5: Communicate Results
• Phase 6: Operationalize
• Case Study: GINA
2.1 DATA ANALYTICS
LIFECYCLE OVERVIEW
• The Data Analytics Lifecycle is designed for Big Data problems and data science projects
• It has six phases, and project work can occur in several phases simultaneously
• The cycle is iterative, reflecting how real projects progress
• Work can return to earlier phases as new information is uncovered
2.1.1 KEY ROLES FOR A SUCCESSFUL ANALYTICS PROJECT
• Business User – understands the domain area
• Project Sponsor – provides requirements
• Project Manager – ensures objectives are met
• Business Intelligence Analyst – provides business domain expertise based on a deep understanding of the data
• Database Administrator (DBA) – creates the DB environment
• Data Engineer – provides technical skills, assists with data management and extraction, supports the analytic sandbox
• Data Scientist – provides analytic techniques and modeling
2.1.2 BACKGROUND AND OVERVIEW OF DATA ANALYTICS LIFECYCLE
• The Data Analytics Lifecycle defines the analytics process and best practices from discovery to project completion
• The Lifecycle draws on aspects of:
• The scientific method
• Cross Industry Standard Process for Data Mining (CRISP-DM), a process model for data mining
• Davenport’s DELTA framework
• Hubbard’s Applied Information Economics (AIE) approach
• “MAD Skills: New Analysis Practices for Big Data” by Cohen et al.
https://en.wikipedia.org/wiki/Scientific_method
https://en.wikipedia.org/wiki/Cross_Industry_Standard_Process_for_Data_Mining
http://www.informationweek.com/software/information-management/analytics-at-work-qanda-with-tom-davenport/d/d-id/1085869?
https://en.wikipedia.org/wiki/Applied_information_economics
https://pafnuty.wordpress.com/2013/03/15/reading-log-mad-skills-new-analysis-practices-for-big-data-cohen/
OVERVIEW OF DATA ANALYTICS LIFECYCLE
2.2 PHASE 1: DISCOVERY
1. Learning the Business Domain
2. Resources
3. Framing the Problem
4. Identifying Key Stakeholders
5. Interviewing the Analytics Sponsor
6. Developing Initial Hypotheses
7. Identifying Potential Data Sources
2.3 PHASE 2: DATA PREPARATION
• Includes steps to explore, preprocess, and condition data
• Create a robust environment – an analytics sandbox
• Data preparation tends to be the most labor-intensive phase of the lifecycle
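As a concrete illustration of the explore/preprocess/condition steps, here is a minimal Python sketch (the CSV fields and rules are hypothetical, not from the chapter) that deduplicates inspection records, coerces a score to an integer, and flags rows that need review:

```python
import csv
import io

# Hypothetical raw export: a duplicate row, a missing value, a bad type.
RAW = """asset_id,condition_score,inspected
A-100,72,2021-04-01
A-100,72,2021-04-01
A-101,,2021-04-03
A-102,eighty,2021-04-05
"""

def condition(rows):
    """Deduplicate inspections, coerce scores to int, flag bad rows."""
    seen, clean, flagged = set(), [], []
    for row in rows:
        key = (row["asset_id"], row["inspected"])
        if key in seen:          # drop exact duplicate inspections
            continue
        seen.add(key)
        try:
            row["condition_score"] = int(row["condition_score"])
            clean.append(row)
        except (TypeError, ValueError):
            flagged.append(row)  # missing or non-numeric score
    return clean, flagged

clean, flagged = condition(csv.DictReader(io.StringIO(RAW)))
print(len(clean), len(flagged))  # 1 clean record, 2 flagged for review
```

In a real project the same pattern would run inside the analytics sandbox, with the flagged rows routed back to whoever owns the source data.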
Outsmarting Traffic: Integrating Waze Into Your Maps and AppsCartegraph
Your commuters are frustrated: after all, they’re wasting 41 hours a year in traffic on average. It’s time to help drivers outsmart traffic with real-time, crowd-sourced information. Get an inside look at the City of Colorado Springs Cone Zone Program and see how they’re integrating ArcGIS web maps, asset management data, and Waze to streamline their operations and provide drivers with mobile traffic and navigation alerts.
Check out this PPT and learn how Andy's team is:
- Saving 286 hours, or $11K, in staff time with real-time work and asset data
- Rerouting drivers around closures, congestion, and special events
- Publicly sharing the location of new and ongoing construction projects
You’ve talked the high-performance talk, now it’s time to walk the high-performance walk. Addressing nearly 100 votes, the Cartegraph Winter 2019 release is here to help your team monitor key performance indicators, dig deeper into dashboard data, track work more efficiently with labor crews, and better justify your maintenance plans to council.
Watch this webinar to learn more about:
- Evaluating your progress with a boatload of new default dashboard gadgets
- Using Labor Crews to quickly assign tasks to a team and split the resources
- Visualizing Scenario Builder results on a map by plan year and activity
- Searching Campus content without leaving your Cartegraph browser
Getting Serious About Reaching Your GoalsCartegraph
Did you know 92 percent of people that set New Year’s goals never actually achieve them? If you’re ready to break away from the pack and join the 8 percent of high achievers, then this webinar is for you. Uncover some of the best goal-setting techniques, learn how to effectively test your ideas, and help your local government team produce more measurable, actionable results in 2019.
Improve your success rate by setting SMART and CLEAR goals
Track and measure your progress with simple, real-time dashboards
Motivate your team and turn scattered energy into laser focus
Use pilot programs to your advantage and minimize your risk for innovation
The document summarizes the new features and enhancements in Cartegraph's Fall 2018 release. Key additions include the ability to lock down the system and notify users, new record and dashboard gadget types for analysis, scenario budget categories to apply activities to specific budgets, and performance improvements to the Scenario Builder. The release also features updates to the user interface, support for newer operating systems, and an improved online campus.
Citizen Requests: There Has To Be A Better WayCartegraph
We live in a mobile-first, tell-me-now world. Your citizens expect daily updates and a quick fix when they submit maintenance requests. But, with tight budgets and staff cuts—how are you expected to keep up? Hear how public works crews from coast to coast are saving time on repairs and keeping residents informed.
This PDF will teach you how your team can:
- Improve communication and manage citizen expectations with automated workflows.
- Leverage mobile technology to track assets, assign work, and monitor costs.
- Evaluate your current processes and justify the need for updates and improvements.
As we roll into the final days of summer, we’re excited to share the new features included in the Cartegraph Summer 2018 release. From a new Analytics Dashboard gadget and library search capabilities to increased task creation limits and interactive work order maps, this release is all about boosting the efficiency and effectiveness of your day.
Save your free webinar spot now to learn more about:
- Visualizing data on the dashboard from a user-configurable list
- Performing more actions at once quicker than ever before
- Editing records in a snap with our new library search bar
- Adding greater flexibility and accuracy to your Scenario Builder projections
- Analyzing work orders more effectively
50x the Budget in 2 Years: A Public Works Success StoryCartegraph
Learn how small improvements in asset and work management have led to big changes for the City of Helotes, TX. Public Works Supervisor Josh Mair will share how his team quickly evolved from a pen-and-paper approach to tracking infrastructure in real time, answering council questions in a flash, and using data to justify new hires and budget increases.
Lessons learned surrounding asset data collection
Tips for using operational data to your advantage
Best practices for starting your high-performance journey today
Spring Release 2018: What's New
Now that we’re in full spring thaw, we're excited to present on the exciting new features included in the Cartegraph Spring 2018 release. This release is focused on making you more productive in your operations and includes many new features including some of our most requested client ideas.
Save your free webinar spot now to learn more about:
- The new basic report designer so that you can build quick, simplified tables and charts
- Embedding maps into reports and print
- Quickly searching through records with enhanced record searching
OH Senate Bill 2: What You Need to Know Cartegraph
In less than 6 months, your water utility must comply with the standards outlined in Ohio Senate Bill 2. Does your public water system have an asset management program in place that meets the required technical, financial, and managerial capabilities? Or are you racing against the clock?
- Get up to speed on the latest public water treatment system asset management requirements
- Uncover the draft administrative rules for asset management in Ohio
- Learn how to leverage mobile technology to facilitate compliance
Green Alleys: An Innovative Approach to Stormwater ManagementCartegraph
This document discusses the history and development of a new technology over many years through trial and error by a research team. It describes several key milestones and discoveries they made along the way as well as challenges they faced in developing a working prototype. After many iterations and improvements, the team was finally able to create a successful prototype that demonstrated the potential of this new technology.
This document is a design review checklist for a permeable pavement project in three alleys in Dubuque, Iowa. It provides details on the project location, drainage area characteristics, stormwater calculations, pavement and base materials, subgrade and foundation protection, construction details, and erosion control measures. Key information includes the project will treat runoff from 3.05 acres of 60% impervious area, include interlocking concrete pavers over gravel bases totaling over 21,000 cubic feet of storage, and tie into existing storm intakes with overflow draining longitudinally down the alleys.
3 Tips to Master Storytelling in Local GovCartegraph
This document provides tips for using storytelling in local government. It recommends collecting data and converting it into meaningful information to inform decisions. Leaders should then candidly share the facts with passion but without bias. The key is connecting the dots between facts and telling a compelling story that engages people. Stories should be communicated plainly without jargon. Facts still matter, and doing nothing can have consequences, so sharing facts through stories can help end apathy.
This document provides an overview of security settings and configuration for a Cartegraph OMS site. It discusses creating new users and roles, OMS security settings, layouts, library filters, and using ArcGIS Online to control map access. The document explains that security is managed by roles, how to add new users and roles, customize roles, and add filters to libraries. It also notes that security is a combination of role settings, layout assignments, library filters, and map settings, and that default roles should not be used for production level users.
Proactively managing your assets saves money, makes the community safer, and aids in planning. Harness Cartegraph OMS’s proactive work and asset management features, and combine the proactive work tools to turn the tide from reactive to proactive operations management.
Cartegraph OMS is built to store, share, and apply data to enhance your community’s performance. Integrate with other programs that gather asset condition, location, or cost information to improve the quality of your data.
The data you collect is vital for day-to-day operations and decision making. As Administrator, you are responsible for keeping Cartegraph running. This session focuses on the Administration settings that shape how system data is created and how data integrity is maintained.
Measuring the Success of Your Strategic Plan Cartegraph
The city of Fort Collins has been working to improve its strategic planning and performance measurement processes over the past decade. It began publishing annual performance reports in 2009-2011 and launched a community dashboard in 2013 to provide transparency. The city completed its first strategic plan in 2014 and now reviews performance on the plan quarterly using strategy maps. The processes have evolved through a continuous improvement approach, with the goal of using data and metrics to demonstrate progress on strategic objectives and ensure resources are effectively allocated to achieve desired outcomes for the community.
Empowering Data-Driven Innovation in GovernmentCartegraph
Join Scotty Martin, founding architect of Denver’s Peak Academy, to learn how to invest in, brand, and position your own internal innovation team. During this discussion, you’ll zero in on what a customer-centric culture is, and how to foster and exploit that culture to drive high performance in your organization. Additionally, you’ll discover how tools like data visualization and process improvement can empower your team to deliver.
Sponsored by ELGL, this session was part of the Cartegraph National Conference/Rocky Mountain High-Performance Government experience.
Rain Check: Preparing Your Stormwater System for Historic FloodingCartegraph
Is your stormwater system prepared to handle a year’s worth of rainfall in less than a week? That was the test for the City of Golden in 2013. Thanks to a proactive stormwater maintenance routine and reliable GIS data, the City incurred minimal damage. And since they were able to provide detailed reports on recovery efforts and costs, they received FEMA reimbursement within months.
Fleet First: How to Manage Your Fleet EffectivelyCartegraph
Smart fleet management is the byproduct of patience, planning, and execution. Regular, proactive monitoring is the key to working efficiently and the easiest way to ensure that the fleet performs to your satisfaction. Learn how you can start tracking your fleet effectively in OMS and streamline your fleet maintenance.
How To Cultivate Community Affinity Throughout The Generosity JourneyAggregage
This session will dive into how to create rich generosity experiences that foster long-lasting relationships. You’ll walk away with actionable insights to redefine how you engage with your supporters — emphasizing trust, engagement, and community!
Contributi dei parlamentari del PD - Contributi L. 3/2019Partito democratico
Published below, pursuant to Art. 11 of Law No. 3/2019, are the amounts received from the entry into force of that law (31/01/2019) through the calendar month preceding their publication on this site.
Jennifer Schaus and Associates hosts a complimentary webinar series on The FAR in 2024. Join the webinars on Wednesdays and Fridays at noon, eastern.
Recordings are on YouTube and the company website.
https://www.youtube.com/@jenniferschaus/videos
RFP for Reno's Community Assistance CenterThis Is Reno
Property appraisals completed in May for downtown Reno’s Community Assistance and Triage Centers (CAC) reveal that repairing the buildings to bring them back into service would cost an estimated $10.1 million—nearly four times the amount previously reported by city staff.
This report explores the significance of border towns and spaces for strengthening responses to young people on the move. In particular it explores the linkages of young people to local service centres with the aim of further developing service, protection, and support strategies for migrant children in border areas across the region. The report is based on a small-scale fieldwork study in the border towns of Chipata and Katete in Zambia conducted in July 2023. Border towns and spaces provide a rich source of information about issues related to the informal or irregular movement of young people across borders, including smuggling and trafficking. They can help build a picture of the nature and scope of the type of movement young migrants undertake and also the forms of protection available to them. Border towns and spaces also provide a lens through which we can better understand the vulnerabilities of young people on the move and, critically, the strategies they use to navigate challenges and access support.
The findings in this report highlight some of the key factors shaping the experiences and vulnerabilities of young people on the move – particularly their proximity to border spaces and how this affects the risks that they face. The report describes strategies that young people on the move employ to remain below the radar of visibility to state and non-state actors due to fear of arrest, detention, and deportation while also trying to keep themselves safe and access support in border towns. These strategies of (in)visibility provide a way to protect themselves yet at the same time also heighten some of the risks young people face as their vulnerabilities are not always recognised by those who could offer support.
In this report we show that the realities and challenges of life and migration in this region and in Zambia need to be better understood for support to be strengthened and tuned to meet the specific needs of young people on the move. This includes understanding the role of state and non-state stakeholders, the impact of laws and policies and, critically, the experiences of the young people themselves. We provide recommendations for immediate action, recommendations for programming to support young people on the move in the two towns that would reduce risk for young people in this area, and recommendations for longer term policy advocacy.
The 6 Common Data Collection Mistakes You May Be Making
1. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
NATHAN KEBEDE, P.E.
Pavement Engineer, Cartegraph
2. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
Outline
• Introduction to asset data collection
• The 6 common mistakes
• Summary
• Q&A
3. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
Infrastructure Assets: The Basics
• Government manages many assets used by the public
• Design, build, maintain, rehab, replace
4. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
Asset Data Collection: The Basics
• Good asset data is needed to manage all these assets effectively and efficiently
• The most important data types:
• Asset type, location, condition
• Without this data, effective and efficient management of these assets is almost impossible
5. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
Asset Data Collection: The Benefits
• Improves operational effectiveness
• Boosts efficiency
• Facilitates forecasting
• Sharpens decision making
• Results in a high-performance team, department, or organization
6. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
Common Methods of Asset Data Collection
• Manual method—boots on the ground
• Automated
• Aerial imagery
• Right-of-way imagery
• LiDAR point-cloud
7. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
Common Methods of Asset Data Collection
Manual
• Boots on the ground
• Can be done in-house
• Relatively slow
• Good for hard-to-reach areas
Automated
• Uses vehicles or aircraft (including drones)
• Aerial imagery, right-of-way imagery, LiDAR point-cloud
• Relatively fast
• Typically outsourced
• Post-processing needed
9. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
#1: Failing to Set Goals
• Define specifically:
• What type of data should be collected?
• What format should the data be in?
• How will the data be used?
• Set goals before starting data collection
10. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
#2: Choosing Quantity over Quality
• Examples of quantity over quality:
• Collecting signs but not identifying sign types
• Collecting assets with poor geospatial data
• Never compromise quality for the sake of quantity
• Implement QA/QC measures
• Identify the quality level you need for each data type collected
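A QA/QC measure can be as simple as a validation pass run over each batch of collected records. The sketch below is illustrative only; the schema, sign types, and bounding box are assumptions, not Cartegraph fields:

```python
# Minimal QA/QC pass for collected sign assets (hypothetical schema).
# Rejects records missing a sign type or carrying implausible coordinates.

VALID_SIGN_TYPES = {"stop", "yield", "speed_limit", "warning"}
# Rough bounding box for the service area (example values).
LAT_RANGE = (41.0, 43.0)
LON_RANGE = (-91.5, -89.5)

def validate(record):
    """Return a list of quality problems; an empty list means the record passes."""
    problems = []
    if record.get("sign_type") not in VALID_SIGN_TYPES:
        problems.append("unknown or missing sign_type")
    lat, lon = record.get("lat"), record.get("lon")
    if lat is None or not LAT_RANGE[0] <= lat <= LAT_RANGE[1]:
        problems.append("latitude outside service area")
    if lon is None or not LON_RANGE[0] <= lon <= LON_RANGE[1]:
        problems.append("longitude outside service area")
    return problems

records = [
    {"sign_type": "stop", "lat": 42.5, "lon": -90.7},   # passes
    {"sign_type": None, "lat": 42.5, "lon": -90.7},     # missing type
    {"sign_type": "yield", "lat": 0.0, "lon": 0.0},     # bad GPS fix
]
report = [validate(r) for r in records]
```

Running a check like this daily, while the crew can still revisit the site, is far cheaper than discovering bad records months later.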
11. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
#3: Creating Data Silos
• Examples that lead to data silos:
• Using proprietary systems that do not easily integrate with other systems
• Lack of communication between departments, leading to the same data being collected multiple times
• Make data easily accessible and shareable
• Communicate with other departments that use the same data
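One way to keep data out of silos is to publish it in an open, widely supported format instead of a proprietary export. The following sketch (stdlib only; the record fields are hypothetical) converts flat asset records to GeoJSON, which most GIS tools can read:

```python
import json

# Hypothetical asset records from an internal system.
signs = [
    {"asset_id": "S-1", "sign_type": "stop", "lat": 42.50, "lon": -90.66},
    {"asset_id": "S-2", "sign_type": "yield", "lat": 42.51, "lon": -90.67},
]

def to_geojson(records):
    """Convert flat asset records to a GeoJSON FeatureCollection."""
    features = [
        {
            "type": "Feature",
            # GeoJSON uses [longitude, latitude] order.
            "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
            "properties": {"asset_id": r["asset_id"], "sign_type": r["sign_type"]},
        }
        for r in records
    ]
    return {"type": "FeatureCollection", "features": features}

doc = to_geojson(signs)
text = json.dumps(doc)  # share this file instead of a proprietary export
```

Because GeoJSON is a published open standard, the sign layer produced here can be loaded by other departments without the system that created it.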
12. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
#4: Relying on Human Memory
• Examples of relying on memory:
• A person with 20+ years in the department holds all the information
• Human memory loses details, and that person may leave
• Keeping data regularly updated and readily available across the team is more sustainable
13. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
#5: Overlooking Data You Already Have
• Examples of overlooked data:
• Old paper records stored in filing cabinets
• Data other teams may already have about the asset
• Do a thorough records review to avoid duplicating effort
• Identifying your data gaps leads to more effective data collection
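A records review boils down to a simple gap analysis: compare the attributes your goals require against the attributes your existing sources already cover, and collect only the difference. A sketch with hypothetical attribute names:

```python
# Attributes required by the management goals (hypothetical).
required = {"asset_id", "location", "condition", "install_year", "material"}

# Attributes already present in existing records, per source.
existing_sources = {
    "paper_files": {"asset_id", "install_year", "material"},
    "gis_layer": {"asset_id", "location"},
}

def gap_analysis(required, sources):
    """Return the required attributes not covered by any existing data source."""
    covered = set().union(*sources.values())
    return sorted(required - covered)

gaps = gap_analysis(required, existing_sources)
print(gaps)  # only 'condition' still needs field collection
```

In this example the filing cabinets and the GIS layer together already cover four of the five required attributes, so field crews only need to collect condition data.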
14. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
#6: Trying to Do It Alone
• Most agencies underestimate the effort data collection takes
• Firms that do data collection full-time:
• Have more domain expertise
• Are more efficient and effective
15. THE 6 COMMON DATA COLLECTION MISTAKES YOU MAY BE MAKING
Summary
1. Always set goals before data collection
2. Choose quality over quantity
3. Make data accessible
4. Don’t rely on human memory
5. Determine data gaps prior to data collection
6. Don’t go it alone