All organizations are increasingly data-hungry but often have to deal with very poor data quality. At DC Public Schools we took a realistic, pragmatic approach to building a data reporting infrastructure that met the organization where it was today and built a pathway to where it needs to be tomorrow.
Project Management and Technology don't matter - Andrew Patricio
Implemented a new Student Information System on time, under budget, and very successfully by focusing on what is really important instead of what is normally thought of as important:
1. Managing failure rather than success
2. In-house DCPS technical expertise and leadership
3. Strong focus on people rather than technology
Learn why solution design is critical and what the components of a solution architecture are. Boston Technology Corporation (BTC) has expertise in Strategic Consulting and Solution Design Services. Visit our website to see some of our work at http://www.boston-technology.com/
In this advanced business analysis training session, you will learn about RPA. Topics covered in this session are:
• What is RPA?
• Making Office Productive
• Consequences
• Automation
For more information, click here: https://www.mindsmapped.com/courses/business-analysis/advanced-business-analyst-training/
Digital strategy is a statement of the organisation’s digital positioning, its competitors, and the needs and behaviour of customers and collaborators, setting a direction for innovation, communication, transaction and promotion.
This describes facets of exploring the options for digital to ensure that the resulting strategy is realistic, achievable and will deliver a return.
Enterprise Architecture needs to be involved in the development of digital architecture. Digital architecture needs to be at the core of the organisation’s wider Enterprise Architecture.
Technology generally accelerates existing business momentum rather than being the originator of momentum. Digital is not a panacea. Digital interactions with third parties give rise to expectations.
Digital will make weaknesses in business processes and underlying technology very evident very quickly. Iterate through digital initiatives, starting small and focussed, learning from experience.
RPA (Robotic Process Automation), POA (Process Oriented Architecture) And BPM... - Alan McSweeney
RPA (Robotic Process Automation) is an opportunity to add value by creating (partially or completely) automated meta processes that control one or more existing applications to automate the interactions with those applications and thus enable the successful operation of the process.
RPA can reduce manual effort and manual errors, improve quality and accuracy, and ensure consistency. RPA-based processes are always available, can respond to changes more quickly and are more scalable than manual processes. They capture process information for reporting, analysis and process improvement and provide greater visibility and control.
Successful RPA is a pre-requisite to exploiting other technologies and approaches such as artificial intelligence.
POA (Process Oriented Architecture) is concerned with linking process areas to actual (desired) interactions – customer (external interacting party) service journeys through the organisation.
BPM (Business Process Management) is the disciplined approach to identify, design, execute, document, measure, monitor and control both automated and non-automated business processes to achieve consistent, targeted results aligned with an organisation’s strategic goals.
The increasing velocity of change means that informal, undocumented expertise makes reaction slow and exceptions are known and understood only locally – process architecture ensures knowledge is documented and change can happen quickly.
A change to digital operations means that internal processes are exposed – the potentially inefficient and manual processes must be made efficient and external interactions must be masked from the internal complexity.
Moving the organisation from one that is internally focussed around its siloed structures to one that is focussed on customer (external interacting party) straight-through interactions.
Automating existing processes requires a structured approach to process analysis.
A structured approach to designing new optimised processes is important to successful RPA implementation.
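As a hedged illustration of the meta process described above, the following Python sketch shows a controller that drives steps across existing applications, captures each interaction for reporting and analysis, and routes failures to a manual queue. The application names and steps are hypothetical assumptions, not from the original presentation.

```python
# Illustrative RPA "meta process": a controller that automates interactions
# with existing applications, logs every step (for reporting, analysis and
# process improvement) and escalates failures to a human work queue.
from dataclasses import dataclass, field

@dataclass
class StepResult:
    step: str
    ok: bool
    detail: str = ""

@dataclass
class MetaProcess:
    name: str
    log: list = field(default_factory=list)          # captured process information
    manual_queue: list = field(default_factory=list)  # exceptions for humans

    def run_step(self, step_name, action):
        try:
            result = StepResult(step_name, True, action())
        except Exception as exc:  # unhandled cases fall back to manual handling
            result = StepResult(step_name, False, str(exc))
            self.manual_queue.append(step_name)
        self.log.append(result)
        return result.ok

# Hypothetical interactions with two existing applications
def read_order_from_erp():
    return "order 1001 read from ERP"

def update_crm():
    return "CRM record updated"

process = MetaProcess("order-to-crm sync")
process.run_step("read ERP order", read_order_from_erp)
process.run_step("update CRM", update_crm)

for result in process.log:
    print(result.step, "ok" if result.ok else "MANUAL", result.detail)
```

The captured `log` is what gives an RPA-driven process its visibility and control advantage over a purely manual one.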
Review of Information Technology Function Critical Capability Models - Alan McSweeney
IT function critical capabilities are key areas where the IT function needs to maintain significant levels of competence, skill, experience and practice in order to operate and deliver a service. There are several different IT capability frameworks. The objective of these notes is to assess the suitability and applicability of these frameworks. These models can be used to identify what is important for your IT function based on your current and desired/necessary activity profile.
Capabilities vary across organisations – not all capabilities have the same importance for all organisations. These frameworks do not readily accommodate variability in the relative importance of capabilities.
The assessment approach taken is to identify a generalised set of capabilities needed across the span of IT function operations, from strategy to operations and delivery. This generic model is then used to assess individual frameworks to determine their scope and coverage and to identify gaps.
The generic IT function capability model proposed here consists of five groups or domains of major capabilities that can be organised across the span of the IT function:
1. Information Technology Strategy, Management and Governance
2. Technology and Platforms Standards Development and Management
3. Technology and Solution Consulting and Delivery
4. Operational Run The Business/Business as Usual/Service Provision
5. Change The Business/Development and Introduction of New Services
In the context of trends and initiatives such as outsourcing, transition to cloud services and greater platform-based offerings, should the IT function develop and enhance its meta-capabilities – the management of the delivery of capabilities? Is capability identification and delivery management the most important capability? Outsourced service delivery in all its forms is not a fire-and-forget activity. You can outsource the provision of any service except the management of the supply of that service.
The following IT capability models have been evaluated:
• IT4IT Reference Architecture https://www.opengroup.org/it4it contains 32 functional components
• European e-Competence Framework (ECF) http://www.ecompetences.eu/ contains 40 competencies
• ITIL V4 https://www.axelos.com/best-practice-solutions/itil has 34 management practices
• COBIT 2019 https://www.isaca.org/resources/cobit has 40 management and control processes
• APQC Process Classification Framework - https://www.apqc.org/process-performance-management/process-frameworks version 7.2.1 has 44 major IT management processes
• IT Capability Maturity Framework (IT-CMF) https://ivi.ie/critical-capabilities/ contains 37 critical capabilities
The following model has not been evaluated:
• Skills Framework for the Information Age (SFIA) - http://www.sfia-online.org/ lists over 100 skills
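The gap assessment described above, mapping each framework's capabilities onto the five generic domains to find what a framework leaves uncovered, can be sketched as follows. The domain mappings below are illustrative assumptions, not the actual assessment results from the original notes.

```python
# Hedged sketch of framework gap assessment: for each framework, report
# which of the five generic capability domains it does not cover.
GENERIC_DOMAINS = {
    "Strategy, Management and Governance",
    "Standards Development and Management",
    "Consulting and Delivery",
    "Run the Business",
    "Change the Business",
}

# Illustrative (hypothetical) coverage judgements per framework
framework_coverage = {
    "ITIL V4": {"Run the Business", "Change the Business",
                "Strategy, Management and Governance"},
    "COBIT 2019": {"Strategy, Management and Governance",
                   "Standards Development and Management",
                   "Run the Business"},
}

def coverage_gaps(coverage, domains=GENERIC_DOMAINS):
    """Return, per framework, the sorted list of uncovered generic domains."""
    return {name: sorted(domains - covered) for name, covered in coverage.items()}

for name, gaps in coverage_gaps(framework_coverage).items():
    print(f"{name}: missing {gaps if gaps else 'nothing'}")
```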
Review existing data management maturity models to identify a core set of characteristics of an effective data maturity model:
DMBOK (Data Management Body of Knowledge) from DAMA (Data Management Association)
MIKE2.0 (Method for an Integrated Knowledge Environment) Information Maturity Model (IMM)
IBM Data Governance Council Maturity Model
Enterprise Data Management Council Data Management Maturity Model
Big Data, Big Problems: Avoid System Failure with Quality Analysis - Webinar ... - CAST
Do you want to make your systems more reliable and resilient before your organization becomes the next headline? View the slides from our recent webinar with Melinda Ballou, Program Director for IDC's Application Life-Cycle Management & Executive Strategies research.
Melinda discusses the trends driving recent high-profile outages with increasing frequency, and gives practical advice on adapting your strategy for quality analysis and improving architectural design upfront. To view the recording, visit http://www.castsoftware.com/news-events/event/avoid-system-failure-idc?gad=ss
This examines the potential for the application of Design Science principles to the solution design process within solution architecture to improve the rigour and accuracy of solution designs.
Design Science is the structured and systematic process for creating designs that resolve problems. It is concerned with the structured process for the acquisition and application of knowledge in relation to the problems to be resolved and the solution knowledge to be applied.
The application of Design Science must be a means to an end – better solution quality – and not an end in itself, where the incentive is for the design function to grow large.
Solution architecture requires a (changing) combination of technical, leadership, interpersonal skills, experience, analysis, appropriate creativity, reflection and intuition applied in a structured manner.
Knowledge management – problem knowledge and solution knowledge – is at the core of the application of design science principles.
Knowledge management requires good management of the solution architecture function.
Crime Scene Investigation: Content – Who killed Enterprise Content Management? As consumer technology attracts more attention, enterprise content management (ECM) seems to have disappeared. This presentation by John Newton was given at the Technology Services Group event led by Dave Giordano at the University of Chicago Gleacher Center on 8 June 2011.
User Experience as an Organizational Development Tool - Donovan Chandler
Developers sometimes begin a project by racing to the specification document and an ERD. Wait! Even if you're developing iteratively, there's a huge amount of potential being missed in most projects.
I propose that your projects will be more successful and valuable to your clients if you think of yourself not just as a database developer but as a process consultant. This presentation outlines a few concepts for addressing the human and political aspects of database system development and concludes with an example scenario.
This was presented at a FileMaker training session and is my first public presentation. Thank you for looking!
Solution Architecture And User And Customer Experience - Alan McSweeney
User experience is the sum of experiences across all dimensions of a solution and the user’s interaction with it, including its functionality and quality attributes. It is the sum of all interactions with the solution and the results the solution provides. Solution usability is much, much more than a user interface.
Users experience the complete operational solution across its entire scope, including its functional and quality properties. The solution architect must be aware of the usability of designed solutions. Usability is not an afterthought: it must be embedded in the overall solution design from the start.
The dimensions of solution usability are:
• Components of overall solution
• Functional components of solution
• Quality properties
The complete solution is always much more than just a bunch of software. Implementing the end-to-end components of the solution positively impacts solution usability and utility. Without the complete view there will be gaps in the usability of the solution.
Enterprise architecture needs to provide leadership in defining and implementing an approach to measuring solution usability. Enterprise architecture needs to define standards and associated frameworks for
• Overall experience
• Solution usability
Each of these needs to include a measurement and analysis framework. Solution architecture needs to incorporate these standards into solution designs. Individual solutions incorporate usability standards; the overall set of solutions comprises the experience.
Webinar on 4th Industrial Revolution, IoT and RPA - Redwan Ferdous
This is a summarized presentation on the 4th Industrial Revolution, the Internet of Things and Robotic Process Automation (RPA), aimed at undergraduate students and recent graduates seeking an overview of the topics based on global and local trends. Most of the content is from online sources, which are cited with due respect in the final three slides.
The webinar was arranged by the IEEE ISTT Student Branch, Bangladesh on 15 May 2020. The session was two hours long.
Note: Slides 6-42 were taken from one of my earlier sessions, presented for the IEEE RU Student Branch. Those slides can be found here: https://www.slideshare.net/RedwanFerdous/roadmap-to-4th-industrial-revolutioniot-iiot
Shadow IT And The Failure Of IT Architecture - Alan McSweeney
The continued existence and growth of shadow IT gives IT architecture the opportunity to show leadership. IT architecture can be the gateway for business IT solution requirements, from initial solution concept through to solution realisation.
Shadow IT is a set of reactions by business functions to an actual or perceived inability or unwillingness of the IT function to respond to business needs for IT solutions. There are many aspects of shadow IT:
• Shadow Projects
• Shadow Data
• Shadow Sourcing
• Shadow Development
• Shadow Solutions
• Shadow Support Arrangements
Shadow IT takes many forms and types:
1. CUST – customised solution developed by a third-party
2. DEV – personal devices used to access business systems or authenticate access to hosted solutions used for business
3. DIY – end-user computing application developed by the business
4. HOME – organisation data sent to home devices to be worked on
5. MSG – public messaging and data exchange platforms
6. OPEN – open-source software used as a stand-alone solution or incorporated into other solutions
7. OUT – outsourced service solution
8. PROD – software product acquired by the business and implemented on organisation infrastructure
9. PUB – accessing organisation applications and data using public devices or networks
10. STOR – public data storage and exchange platforms
11. SVC – hosted software solution
Uncontrolled shadow IT represents a real risk to organisations. The experience from previous shadow IT examples is that they have resulted in real financial losses. IT architecture can and should take the lead in implementing structures and processes to mitigate risks while maximising the benefits of shadow IT.
Predictive Analytics in Practice - Breakfast Club 11th May 2017 - Bilot
Ashwin Kumar unlocked the power of predictive analytics & machine learning at Bilot's Breakfast Club. Find upcoming Breakfast Clubs: http://www.bilot.fi/en/events/
Read more about our services for Analytics: http://www.bilot.fi/en/solutions/data-and-analytics/
Investing Intelligently In The IT Function - Alan McSweeney
Describes an approach to defining the competencies and capabilities required of the IT function, using current levels of competence and the importance of each competency across all activity areas of the IT function to identify the areas where getting better will yield the greatest return, allowing targeted investment of resources to get good at what matters.
Operational Risk Management Data Validation Architecture - Alan McSweeney
This describes a structured approach to validating data used to construct and use an operational risk model. It details an integrated approach to operational risk data involving three components:
1. Using the Open Group FAIR (Factor Analysis of Information Risk) risk taxonomy to create a risk data model that reflects the required data needed to assess operational risk
2. Using the DMBOK model to define a risk data capability framework to assess the quality and accuracy of risk data
3. Applying standard fault analysis approaches - Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) - to the risk data capability framework to understand the possible causes of risk data failures within the risk model definition, operation and use
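The FMEA part of step 3 can be sketched in a few lines: each failure mode in the risk data capability framework is rated for severity, occurrence and detection, and the product of the three (the Risk Priority Number, RPN) drives remediation priority. The failure modes and ratings below are illustrative assumptions, not figures from the original presentation.

```python
# Hedged FMEA sketch for risk data failures: RPN = severity x occurrence
# x detection, each rated 1-10; higher RPN means a more urgent failure mode.
from dataclasses import dataclass

@dataclass
class FailureMode:
    capability: str    # risk data capability area (e.g. drawn from DMBOK)
    description: str
    severity: int      # 1 (negligible) .. 10 (catastrophic)
    occurrence: int    # 1 (rare) .. 10 (frequent)
    detection: int     # 1 (always caught) .. 10 (never caught)

    @property
    def rpn(self):
        return self.severity * self.occurrence * self.detection

# Illustrative failure modes in the risk data feeding the operational risk model
modes = [
    FailureMode("Data Quality", "loss event amounts mis-keyed", 7, 5, 4),
    FailureMode("Metadata", "risk taxonomy mapping out of date", 5, 6, 7),
    FailureMode("Data Integration", "feed from incident system dropped", 8, 3, 2),
]

# Prioritise remediation by RPN, highest first
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.rpn:4d}  {m.capability}: {m.description}")
```

Note that a rarely occurring but hard-to-detect failure (here the stale taxonomy mapping) can outrank a more severe one, which is exactly the insight FMEA is meant to surface.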
Solution Architecture and Solution Acquisition - Alan McSweeney
This describes a systematised and structured approach to solution acquisition or procurement that involves solution architecture from the start, so that the true scope of both the required and the subsequently acquired solution is fully understood. By using such an approach, poor solution acquisition outcomes are avoided.
Solution architecture provides the structured approach to capturing all the cost contributors and knowing the true solution scope.
There is increasing packaged/product/service-based solution acquisition activity and a growing trend towards solutions hosted outside the organisation. Meanwhile, solution acquisition outcomes are poor and getting worse.
Poor solution acquisition has long-term consequences and costs.
The to-be-acquired solution needs to operate in and co-exist with an existing solution topography and the solution acquisition process needs to be aware of and take account of this wider solution topography. Cloud-based or externally hosted and provided solutions do not eliminate the need for the solution to exist within the organisation solution topography.
Strategic misrepresentation in solution acquisition is the deliberate distortion or falsification of information relating to solution acquisition costs, complexity, required functionality, solution availability, resource availability or time to implement, in order to get solution acquisition approval. Strategic misrepresentation is very real and its consequences can be very damaging.
Solution architecture has the skills and experience to define the real scope of the solution being acquired. An effective structured solution acquisition process, well-implemented and consistently applied, means dependable and repeatable solution acquisition and successful outcomes.
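Capturing all the cost contributors is the concrete heart of knowing the true solution scope. The following sketch totals a multi-year cost from a set of contributor categories; the categories and figures are purely illustrative assumptions, not from the original presentation.

```python
# Hedged sketch: totalling all cost contributors of a to-be-acquired
# solution over a multi-year horizon. Each entry is (year-1 cost,
# annual recurring cost) in currency units - all figures hypothetical.
cost_contributors = {
    "licences/subscription": (120_000, 120_000),
    "implementation and integration": (200_000, 0),
    "data migration": (60_000, 0),
    "training and change management": (40_000, 10_000),
    "operations and support": (0, 50_000),
    "decommissioning of replaced systems": (30_000, 0),
}

def total_cost(contributors, years=5):
    """True multi-year cost: year-1 costs plus recurring costs for the
    remaining years."""
    return sum(y1 + annual * (years - 1) for y1, annual in contributors.values())

print(f"5-year cost: {total_cost(cost_contributors):,}")
```

A one-year view would capture well under half of the five-year figure here, which is how strategic misrepresentation by omission typically works.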
Dialogue Tool for Value Creation in Digital Transformation: Roadmapping for... - Naoshi Uchihira
With the rapid spread of digital technologies into industry and society, collaboration between humans and machines (artificial intelligence and machine learning) becomes an important issue, but it is not clear what kind of value can be created by this collaboration. Roadmapping is effective as a dialogue tool for clarifying that value among stakeholders. However, traditional roadmapping methods are insufficient, since human-machine collaboration is a socio-technical system whose elements evolve together while influencing each other. This paper proposes a new co-evolutionary technology roadmapping method and reports the results of a roadmapping workshop for machine learning applications.
This presentation describes a systematic, repeatable and co-ordinated approach to agile solution architecture and design. It is intended to describe a set of practical steps and activities embedded within a framework to allow an agile method to be adopted and used for solution design and delivery. This approach ensures consistency in the assessment of solution design options and in subsequent solution design and solution delivery activities. The process leads to the rapid design and delivery of realistic and achievable solutions that meet real solution consumer needs. The approach provides for effective solution decision-making and generates options and results quickly and consistently. Implementing a framework such as this provides for the creation of a knowledgebase of previous solution design and delivery exercises, leading to an accumulated body of knowledge within the organisation.
The Centre Cannot Hold: Making IT Architecture Relevant In A Post-IT World - Alan McSweeney
Business has a consistently poor experience of the internal IT function, and it is now all too easy for the business to bypass it. There is a business shift to cloud service providers offering infrastructure-less solutions with no perceived IT involvement. Outsourcing and the divestment of IT functions occur in response to business wishes to remove the overhead. The business needs to respond to the interrelated developments of digital, mobile and social computing, and perceives the central IT function as unable to respond.
If the IT function cannot react to the requirements of the business due to business pressures, the business will go elsewhere. Shadow IT - the acquisition of IT solutions outside the control of the IT function - is an unpleasant and common reality. 50% of IT expenditure is routinely spent outside the control of the IT function. Shadow IT is a symptom of a post-IT world.
The central IT function loses relevance and control. Businesses reduce their reliance on the core IT function.
IT architecture should act as a glue joining the business strategy to the IT strategy. IT architecture needs to operate as an internal business consulting and advisory function. An effective business-oriented IT architecture function can get the correct balance between too little and too much, too slowly and too quickly. The IT architecture team needs to operate as a team rather than a set of siloed, internally focussed IT roles, involving business people as well as technologists.
Identifying knowledge value measurement in a company - June 2006 - Epistema
Paper contribution for the Knowledge Board Community (UK), following the conference "Contactivity '06", April 2006, at the Business School of the University of Greenwich, London.
Translating Big Raw Data Into Small Actionable Information - Alan McSweeney
Any approach to Big Data needs to be based rigorously on business value. Big Data exists across the organisation’s operating landscape and not just for customers. Such data presents the potential for significant value that can enhance the way organisations do business and interact with external parties. There is a need for a realistic and achievable approach to translating Big Raw Data into Small Actionable Information.
Big Data is intrinsically linked to digital operations and associated digital transformation.
So ignore the issues of scope, lack of definition, conflicts, differences and complexity, and focus on the identification, specification, development and implementation of approaches, strategies, processes, expertise, solutions, systems and data that can provide actionable information to achieve outcomes that produce business value.
The approach to generating real value needs to encompass:
1. Definition and understanding of the Big Raw Data landscape, including data sources, platforms, systems and applications, parties, journeys and interactions
2. Identification and selection of high potential value use cases for implementation for selected parties
3. Definition of IT strategies, facilities, tools, techniques and resources to reduce the volume of Big Raw Data to translate it into Small Actionable Information
4. System and application changes to actualise use cases
5. Understanding and appreciation of wider operational context – Campaign Management, Customer Relationship Management, Customer Experience Management, Customer Value Management
6. Implementation of underpinning data governance and data privacy protocols
7. Organisational and process changes to identify, implement and operate use cases
There are only a limited number of actionable insights available from Big Raw Data. There are only a limited number of actions the organisation can reasonably take. It is important not to swamp the organisation with lots of irrelevant pseudo insights. It is important to prioritise the actions recommended from the derived insights.
Exploiting Big Raw Data to generate business value requires resources. This means management commitment and sponsorship.
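The reduction and prioritisation described above, shrinking a large raw event feed to a small, capped list of high-value actions, can be sketched as follows. The event data, value scores and cap are illustrative assumptions, not from the original presentation.

```python
# Hedged sketch of Big Raw Data -> Small Actionable Information:
# aggregate raw interaction events per party, score candidate actions by
# estimated business value, and cap the list so the organisation is not
# swamped with irrelevant pseudo insights.
from collections import Counter

raw_events = [  # (party, interaction) pairs - stand-in for a large feed
    ("cust-1", "complaint"), ("cust-1", "complaint"), ("cust-2", "purchase"),
    ("cust-3", "churn-signal"), ("cust-1", "churn-signal"), ("cust-2", "purchase"),
]

# Illustrative value of acting on each interaction type
ACTION_VALUE = {"churn-signal": 100, "complaint": 40, "purchase": 10}
MAX_ACTIONS = 2  # only a limited number of actions can reasonably be taken

def actionable_insights(events, max_actions=MAX_ACTIONS):
    counts = Counter(events)  # aggregation: the volume-reduction step
    scored = [
        (count * ACTION_VALUE[kind], party, kind)
        for (party, kind), count in counts.items()
    ]
    scored.sort(reverse=True)  # prioritise by estimated business value
    return scored[:max_actions]

for value, party, kind in actionable_insights(raw_events):
    print(f"{party}: act on '{kind}' (est. value {value})")
```

The cap (`MAX_ACTIONS`) is the point of the exercise: the output is a short, prioritised action list rather than a dump of every pattern in the raw data.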
Digital Enterprise Architecture: Four Elements Critical to Solution Envisioning - Cognizant
For the digital enterprise, architecture of all varieties must evolve strategically in step with technological capabilities and business imperatives. Such a multidimensional approach includes automation, AI, analytics, big data management and digitization as a holistic phenomenon.
Practical Data Strategies in the real world of poor Data Quality - Andrew Patricio
My presentation for EDW 2017
Foundation
Data Effectiveness
Data Sophistication
Data Prioritization
Consistency, Relevancy, Accuracy
Data Quality Culture
Reporting platform
Managing Requests
Summary
DataEd Slides: Data Management Best Practices - DATAVERSITY
It is clear that Data Management best practices exist, and so does a useful process for improving existing Data Management practices. The question arises: Since we understand the goal, how does one design a process for Data Management goal achievement? This program describes what must be done at the programmatic level to achieve better data use and a way to implement this as part of your data program. The approach combines DMBoK content and CMMI/DMM processes — permitting organizations the opportunity to benefit from the best of both. It also permits organizations to understand:
• Their current Data Management practices
• Strengths that should be leveraged
• Remediation opportunities
Big Data, Big Problems: Avoid System Failure with Quality Analysis - Webinar ...CAST
Do you want to make your systems more reliable and resilient before your organization becomes the next headline? View the slides from our recent webinar with Melinda Ballou, Program Director for IDC's Application Life-Cycle Management & Executive Strategies research.
Melinda discusses the trends driving recent high-profile outages with increasing frequency, and gives practical advice on adapting your strategy for quality analysis and improving architectural design upfront. To view the recording, visit http://www.castsoftware.com/news-events/event/avoid-system-failure-idc?gad=ss
This examines the potential for the application of Design Science principles to the solution design process within solution architecture to improve the rigour and accuracy of solution designs.
Design Science is a structured and systematic process for creating designs that resolve problems. It is concerned with the acquisition and application of knowledge about the problems to be resolved and the solution knowledge to be applied.
The application of Design Science must be a means to an end – better solution quality – and not an end in itself, where the incentive is for the design function simply to grow.
Solution architecture requires a (changing) combination of technical, leadership, interpersonal skills, experience, analysis, appropriate creativity, reflection and intuition applied in a structured manner.
Knowledge management – problem knowledge and solution knowledge – is at the core of the application of design science principles.
Knowledge management requires good management of the solution architecture function.
Crime Scene Investigation: Content – Who killed Enterprise Content Management? As consumer technology commands more attention, enterprise content management (ECM) seems to have disappeared. This presentation by John Newton was given at the Technology Services Group event led by Dave Giordano at the University of Chicago Gleacher Center on 8 June 2011.
User Experience as an Organizational Development ToolDonovan Chandler
Developers sometimes begin a project by racing to the specification document and an ERD. Wait! Even if you're developing iteratively, there's a huge amount of potential being missed in most projects.
I propose that your projects will be more successful and valuable to your clients if you think of yourself not just as a database developer but as a process consultant. This presentation outlines a few concepts for addressing the human and political aspects of database system development and concludes with an example scenario.
This was presented at a FileMaker training session and is my first public presentation. Thank you for looking!
Solution Architecture And User And Customer ExperienceAlan McSweeney
User experience is the sum of experiences across all dimensions of all solutions and the user's interactions with them, including their functionality and quality attributes. It is the sum of all interactions with the solution and the results the solution provides. Solution usability is much, much more than a user interface.
Users experience the complete operational solution across its entire scope, including its functional and quality properties. The solution architect must be aware of the usability of designed solutions. Usability is not an afterthought: it must be embedded in the overall solution design from the start.
The dimensions of solution usability are:
• Components of overall solution
• Functional components of solution
• Quality properties
The complete solution is always much more than just software. Implementing the end-to-end components of the solution positively impacts solution usability and utility. Without the complete view there will be gaps in the usability of the solution.
Enterprise architecture needs to provide leadership in defining and implementing an approach to measuring solution usability. Enterprise architecture needs to define standards and associated frameworks for:
• Overall experience
• Solution usability
Each of these needs to include a measurement and analysis framework. Solution architecture needs to incorporate these standards into solution designs: individual solutions incorporate the usability standards, and the overall set of solutions comprises the experience.
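A measurement framework of the kind described might, at its simplest, score each solution component against its usability properties and roll the scores up into an overall experience measure. The components, properties and scores below are invented purely for illustration.

```python
# Hypothetical sketch of a minimal usability scorecard: each solution
# component is rated against usability properties, and the scores roll up
# into an overall experience score. All names and numbers are illustrative.

scores = {
    # component -> {usability property -> score out of 5}
    "Web front end":   {"learnability": 4, "efficiency": 3, "error recovery": 2},
    "Batch interface": {"learnability": 2, "efficiency": 4, "error recovery": 3},
    "Reporting":       {"learnability": 5, "efficiency": 4, "error recovery": 4},
}

def component_score(props):
    """Average of a component's property scores."""
    return sum(props.values()) / len(props)

overall = sum(component_score(p) for p in scores.values()) / len(scores)
weakest = min(scores, key=lambda c: component_score(scores[c]))
print(f"overall experience score: {overall:.2f}; weakest component: {weakest}")
```

Because every component is scored on the same properties, the weakest link in the overall experience becomes visible rather than hidden behind a single averaged number.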
Webinar on 4th Industrial Revolution, IoT and RPARedwan Ferdous
This is a summarized presentation on the 4th Industrial Revolution, the Internet of Things and Robotic Process Automation (RPA), prepared especially for undergraduate students and recent graduates as an overview of the topics based on global and local trends. Most of the content is drawn from online sources, which are cited with due respect in the final three slides.
The webinar was arranged by IEEE ISTT Student Branch, Bangladesh on 15th May 2020. The session was 2 hours long.
Note: Slides 6 to 42 were taken from one of my earlier sessions, presented for the IEEE RU Student Branch. Those slides can be found here: https://www.slideshare.net/RedwanFerdous/roadmap-to-4th-industrial-revolutioniot-iiot
Shadow IT And The Failure Of IT ArchitectureAlan McSweeney
The continued existence and growth of shadow IT gives IT architecture the opportunity to show leadership. IT architecture can be the gateway for business IT solution requirements, from initial solution concept through to solution realisation.
Shadow IT is a set of reactions by business functions to an actual or perceived inability or unwillingness of the IT function to respond to business needs for IT solutions. There are many aspects of shadow IT:
• Shadow Projects
• Shadow Data
• Shadow Sourcing
• Shadow Development
• Shadow Solutions
• Shadow Support Arrangements
Shadow IT takes many forms and types
1. CUST – customised solution developed by a third-party
2. DEV – personal devices used to access business systems or authenticate access to hosted solutions used for business
3. DIY – end-user computing application developed by the business
4. HOME – organisation data sent to home devices to be worked on
5. MSG – public messaging and data exchange platforms
6. OPEN – open-source software used as a stand-alone solution or incorporated into other solutions
7. OUT – outsourced service solution
8. PROD – software product acquired by the business and implemented on organisation infrastructure
9. PUB – accessing organisation applications and data using public devices or networks
10. STOR – public data storage and exchange platforms
11. SVC – hosted software solution
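To make the taxonomy concrete, here is a hypothetical sketch of a shadow IT register keyed by the category codes above. The category codes and descriptions come from the list in the text; the register entries, owners and the register structure itself are invented for illustration.

```python
# Hypothetical sketch: a minimal shadow-IT register using the taxonomy above.
# Category codes/descriptions follow the text; entries are invented.

SHADOW_IT_TYPES = {
    "CUST": "customised solution developed by a third-party",
    "DEV":  "personal devices used to access business systems",
    "DIY":  "end-user computing application developed by the business",
    "HOME": "organisation data sent to home devices",
    "MSG":  "public messaging and data exchange platforms",
    "OPEN": "open-source software used stand-alone or embedded",
    "OUT":  "outsourced service solution",
    "PROD": "software product acquired and run by the business",
    "PUB":  "access via public devices or networks",
    "STOR": "public data storage and exchange platforms",
    "SVC":  "hosted software solution",
}

def register_instance(register, name, category, business_owner):
    """Validate the category code and record a shadow-IT instance."""
    if category not in SHADOW_IT_TYPES:
        raise ValueError(f"Unknown shadow IT category: {category}")
    register.append({"name": name, "category": category, "owner": business_owner})

register = []
register_instance(register, "Team file sharing", "STOR", "Marketing")
register_instance(register, "Sales macro workbook", "DIY", "Sales")

# Group instances by taxonomy code so risk reviews can target a category.
by_category = {}
for item in register:
    by_category.setdefault(item["category"], []).append(item["name"])
print(by_category)
```

Even a register this simple turns shadow IT from an invisible risk into an inventory that IT architecture can review category by category.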
Uncontrolled shadow IT represents a real risk to organisations. The experience from previous shadow IT examples is that they have resulted in real financial losses. IT architecture can and should take the lead in implementing structures and processes to mitigate risks while maximising the benefits of shadow IT.
Predictive Analytics in Practice - Breakfast Club 11th May 2017Bilot
Ashwin Kumar unlocked the power of predictive analytics & machine learning at Bilot's Breakfast Club. Find coming Breakfast Clubs: http://www.bilot.fi/en/events/
Read more about our services for Analytics: http://www.bilot.fi/en/solutions/data-and-analytics/
Investing Intelligently In The IT FunctionAlan McSweeney
Describes an approach to defining the competencies and capabilities required of the IT function and to use current levels of competence and importance of competency across all activity areas of the IT function to identify those areas at which getting better will yield the greatest return, allowing for targeted investment of resources to get good at what matters
Operational Risk Management Data Validation ArchitectureAlan McSweeney
This describes a structured approach to validating data used to construct and use an operational risk model. It details an integrated approach to operational risk data involving three components:
1. Using the Open Group FAIR (Factor Analysis of Information Risk) risk taxonomy to create a risk data model that reflects the required data needed to assess operational risk
2. Using the DMBOK model to define a risk data capability framework to assess the quality and accuracy of risk data
3. Applying standard fault analysis approaches - Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) - to the risk data capability framework to understand the possible causes of risk data failures within the risk model definition, operation and use
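One standard FMEA device that fits step 3 is the risk priority number (RPN): the product of severity, occurrence and detection ratings for each failure mode. The sketch below applies that calculation to invented risk-data failure modes; the FAIR taxonomy and DMBOK capability framework themselves are not modelled, and all ratings are illustrative assumptions.

```python
# Illustrative sketch only: ranking hypothetical risk-data failure modes by
# the standard FMEA risk priority number (RPN = severity x occurrence x
# detection). The failure modes and 1-10 ratings are invented.

failure_modes = [
    # (failure mode, severity, occurrence, detection), each rated 1-10,
    # where a higher detection score means the failure is harder to detect.
    ("Loss event frequency data incomplete", 8, 6, 7),
    ("Asset value data out of date",          6, 7, 4),
    ("Control strength data never captured",  9, 3, 8),
]

def rpn(severity, occurrence, detection):
    """Standard FMEA risk priority number."""
    return severity * occurrence * detection

ranked = sorted(
    ((name, rpn(s, o, d)) for name, s, o, d in failure_modes),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, score in ranked:
    print(f"{score:4d}  {name}")
```

Ranking by RPN shows why combining the three ratings matters: a rare failure that is severe and hard to detect can outrank a frequent but easily caught one.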
Solution Architecture and Solution AcquisitionAlan McSweeney
This describes a systematised and structured approach to solution acquisition or procurement that involves solution architecture from the start, ensuring the true scope of both the required and the subsequently acquired solution is fully understood. By using such an approach, poor solution acquisition outcomes are avoided.
Solution architecture provides the structured approach to capturing all the cost contributors and knowing the true solution scope.
Solution acquisition is increasingly packaged, product and service-based, and there is an increasing trend towards solutions hosted outside the organisation. Meanwhile solution acquisition outcomes are poor and getting worse.
Poor solution acquisition has long-term consequences and costs.
The to-be-acquired solution needs to operate in and co-exist with an existing solution topography and the solution acquisition process needs to be aware of and take account of this wider solution topography. Cloud-based or externally hosted and provided solutions do not eliminate the need for the solution to exist within the organisation solution topography.
Strategic misrepresentation in solution acquisition is the deliberate distortion or falsification of information relating to solution acquisition costs, complexity, required functionality, solution availability, resource availability, time to implement in order to get solution acquisition approval. Strategic misrepresentation is very real and its consequences can be very damaging.
Solution architecture has the skills and experience to define the real scope of the solution being acquired. An effective structured solution acquisition process, well-implemented and consistently applied, means dependable and repeatable solution acquisition and successful outcomes.
Dialogue Tool for Value Creation in Digital Transformation: Roadmapping for...Naoshi Uchihira
With the rapid spread of digital technologies into industry and society, collaboration between humans and machines (artificial intelligence and machine learning) becomes an important issue, but it is not clear what kind of value can be created by this collaboration. Roadmapping is effective as a dialogue tool for clarifying that value among stakeholders. However, traditional roadmapping methods are insufficient, since collaboration between humans and machines is a socio-technical system in which the two evolve together while influencing each other. This paper proposes a new co-evolutionary technology roadmapping method and reports the results of a roadmapping workshop for machine learning applications.
This presentation describes a systematic, repeatable and co-ordinated approach to agile solution architecture and design. It is intended to describe a set of practical steps and activities embedded within a framework to allow an agile method to be adopted and used for solution design and delivery. This approach ensures consistency in the assessment of solution design options and in subsequent solution design and solution delivery activities. This process leads to the rapid design and delivery of realistic and achievable solutions that meet real solution consumer needs. The approach provides for effective solution decision-making. It generates options and results quickly and consistently. Implementing a framework such as this provides for the creation of a knowledge base of previous solution design and delivery exercises, leading to an accumulated body of knowledge within the organisation.
The Centre Cannot Hold: Making IT Architecture Relevant In A Post IT WorldAlan McSweeney
Business has a consistently poor experience of the internal IT function, and it is now all too easy for the business to bypass central IT: a shift to cloud service providers offering infrastructure-less solutions with no perceived IT involvement; outsourcing and divestment of IT functions in response to business wishes to remove the overhead; and the need to respond to the interrelated developments of digital, mobile and social computing, coupled with a perceived inability of the central IT function to respond.
If the IT function cannot react to the requirements of the business due to business pressures, the business will go elsewhere. Shadow IT - the acquisition of IT solutions outside the control of the IT function - is an unpleasant and common reality. 50% of IT expenditure is routinely spent outside the control of the IT function. Shadow IT is a symptom of a post-IT world.
The central IT function loses relevance and control. Businesses reduce their reliance on the core IT function.
IT architecture should act as the glue joining business strategy to IT strategy. IT architecture needs to operate as an internal business consulting and advisory function. An effective business-oriented IT architecture function can strike the correct balance between too little and too much, too slowly and too quickly. The IT architecture team needs to operate as a team rather than a set of siloed, internally focussed IT roles, involving business people as well as technologists.
Identifying knowledge value measurement in a company - june 2006Epistema
Paper contribution for the Knowledge Board Community (UK), following the conference “Contactivity ’06”, April 2006, at the Business School of the University of Greenwich, London
Translating Big Raw Data Into Small Actionable InformationAlan McSweeney
Any approach to Big Data needs to be based rigorously on business value. Big Data exists across the organisation’s operating landscape and not just for customers. Such data presents the potential for significant value that can enhance the way organisations do business and interact with external parties. There is a need for a realistic and achievable approach to translating Big Raw Data into Small Actionable Information.
Big Data is intrinsically linked to digital operations and associated digital transformation.
So ignore the issues of scope, lack of definition, conflicts, differences and complexity, and focus on the identification, specification, development and implementation of approaches, strategies, processes, expertise, solutions, systems and data that can provide actionable information to achieve outcomes that produce business value.
The approach to generating real value needs to encompass:
1. Definition and understanding of the Big Raw Data landscape, including data sources, platforms, systems and applications, parties, journeys and interactions
2. Identification and selection of high potential value use cases for implementation for selected parties
3. Definition of IT strategies, facilities, tools, techniques and resources to reduce the volume of Big Raw Data to translate it into Small Actionable Information
4. System and application changes to actualise use cases
5. Understanding and appreciation of wider operational context – Campaign Management, Customer Relationship Management, Customer Experience Management, Customer Value Management
6. Implementation of underpinning data governance and data privacy protocols
7. Organisational and process changes to identify, implement and operate use cases
There are only a limited number of actionable insights available from Big Raw Data. There are only a limited number of actions the organisation can reasonably take. It is important not to swamp the organisation with lots of irrelevant pseudo insights. It is important to prioritise the actions recommended from the derived insights.
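Step 2 of the approach (selecting high potential value use cases) and the prioritisation point above can be illustrated with a simple weighted scoring sketch. The candidate use cases, criteria and weights below are assumptions invented for illustration, not part of the original approach.

```python
# Hypothetical sketch: scoring candidate Big Data use cases so that only the
# highest-value, most feasible ones go forward, keeping the organisation from
# being swamped. Use cases, criteria and weights are invented.

candidates = [
    {"name": "Churn early warning",  "value": 9, "feasibility": 6, "data_readiness": 6},
    {"name": "Stock-out prediction", "value": 7, "feasibility": 8, "data_readiness": 7},
    {"name": "Sentiment dashboards", "value": 4, "feasibility": 9, "data_readiness": 8},
]

# Business value dominates; feasibility and data readiness temper it.
WEIGHTS = {"value": 0.5, "feasibility": 0.3, "data_readiness": 0.2}

def score(use_case):
    """Weighted sum of the use case's criterion ratings."""
    return sum(WEIGHTS[k] * use_case[k] for k in WEIGHTS)

# Keep only the top two: a deliberately short, actionable shortlist.
shortlist = sorted(candidates, key=score, reverse=True)[:2]
for uc in shortlist:
    print(f"{score(uc):.1f}  {uc['name']}")
```

Capping the shortlist is the point: the scoring matters less than the discipline of taking only as many actions as the organisation can reasonably execute.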
Exploiting Big Raw Data to generate business value requires resources. This means management commitment and sponsorship.
Digital Enterprise Architecture: Four Elements Critical to Solution EnvisioningCognizant
For the digital enterprise, architecture of all varieties must evolve strategically in step with technological capabilities and business imperatives. Such a multidimensional approach includes automation, AI, analytics, big data management and digitization as a holistic phenomenon.
Practical Data Strategies in the real world of poor Data QualityAndrew Patricio
My presentation for EDW 2017
Foundation
Data Effectiveness
Data Sophistication
Data Prioritization
Consistency, Relevancy, Accuracy
Data Quality Culture
Reporting platform
Managing Requests
Summary
DataEd Slides: Data Management Best PracticesDATAVERSITY
It is clear that Data Management best practices exist, and so does a useful process for improving existing Data Management practices. The question arises: Since we understand the goal, how does one design a process for Data Management goal achievement? This program describes what must be done at the programmatic level to achieve better data use and a way to implement this as part of your data program. The approach combines DMBoK content and CMMI/DMM processes — permitting organizations the opportunity to benefit from the best of both. It also permits organizations to understand:
• Their current Data Management practices
• Strengths that should be leveraged
• Remediation opportunities
Data-Ed: Unlock Business Value through Data Quality Engineering Data Blueprint
Organizations must realize what it means to utilize data quality management in support of business strategy. This webinar focuses on obtaining business value from data quality initiatives. I will illustrate how organizations with chronic business challenges often can trace the root of the problem to poor data quality. Showing how data quality should be engineered provides a useful framework in which to develop an effective approach. This in turn allows organizations to more quickly identify business problems as well as data problems caused by structural issues versus practice-oriented defects and prevent these from re-occurring.
You can sign up for future Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
The creation of a “data democracy” involves different challenges around organizational acceptance, licensing, user access, data comprehension and training than are typically encountered with a more standard business intelligence deployment. In fact, studies suggest that as few as 4% of American companies may have achieved the rare distinction of empowering more than 50% of their employees to directly access and use data. In this session, Mr. Aldrich will discuss these challenges and the solutions that he has designed to create a data democracy for the City Colleges of Chicago.
Predictive Analytics - How to get stuff out of your Crystal BallDATAVERSITY
Everyone wants to leverage data. The optimal implementation of analytics is an organization-wide set of capabilities. These are called advantageous organizational analytic capabilities in that a clear ROI is demonstrable from these efforts. It turns out that there are a number of prerequisites to advantageous organizational analytics. These include:
Adopting a crawl, walk, run strategy
Understanding current and potential organizational maturity and corresponding capabilities
Achieving an appropriate technology/human capability balance
Implementing useful IT systems development practices
Installing necessary non-IT leadership
This webinar will explore these and other topics using examples drawn from DOD, healthcare researchers, and donation center operations.
Founding a Data Democracy: How Ivy Tech is Leading a Revolution in Higher Edu...Brendan Aldrich
Is your data reliable, intuitive, interactive, and immediately available to everyone who needs it? This presentation explores how Ivy Tech, the nation's largest singly-accredited community college system, coupled cloud-based and open-source platforms with predictive analytics and sustainable data practices to create a cost-effective governed data democracy that's helping administrators, staff, and faculty access the data they need to drive student success.
Data-Ed Webinar: Implementing the Data Management Maturity Model (DMM) - With...DATAVERSITY
The Data Management Maturity (DMM) model is a framework for the evaluation and assessment of an organization’s data management capabilities. This model—based on the Capability Maturity Model pioneered by the U.S. Department of Defense for improving software development processes—allows an organization to evaluate its current state data management capabilities, discover gaps to remediate, and identify strengths to leverage. In doing so, this assessment method reveals organizational priorities, business needs, and a clear path for rapid process improvements.
In this webinar, we will:
- Describe the DMM model, its purpose and evolution, and how it can be used as a roadmap for assessing and improving organizational data management and data management maturity
- Discuss how to get the most out of a DMM assessment, including its dependencies and requirements for use
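As a rough illustration of how an assessment's output might be recorded, the sketch below rates hypothetical process areas against a target maturity level and derives gaps to remediate and strengths to leverage. The areas, ratings and target are invented and do not reproduce the DMM model itself.

```python
# Illustrative sketch: a DMM-style assessment result in which each data
# management process area gets a maturity rating (1-5); gaps and strengths
# fall out by comparison with a target level. All values are invented.

TARGET = 3  # hypothetical target maturity level for all areas

assessment = {
    "Data Governance":       2,
    "Data Quality":          1,
    "Data Operations":       3,
    "Platform/Architecture": 4,
}

# Areas below target, with how far they have to climb.
gaps = {area: TARGET - lvl for area, lvl in assessment.items() if lvl < TARGET}
# Areas above target: strengths to leverage, per the webinar's framing.
strengths = [area for area, lvl in assessment.items() if lvl > TARGET]

print("Remediate first:", max(gaps, key=gaps.get))
print("Leverage:", strengths)
```

Turning the ratings into an explicit gap list is what gives the assessment its roadmap quality: the largest gap becomes the first remediation priority.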
Who Owns Faculty Data?: Fairness and transparency in UCLA's new academic HR s...chloejreynolds
Abstract: Beginning in 2015, Opus will be the information system of record for faculty activities at the University of California, Los Angeles (UCLA). Opus will serve as both a profile system, storing data about faculty work, and as a workflow and approval engine for the promotion and tenure process. Opus leverages institutional master data wherever possible to collect data about faculty activity. However, re-purposing institutional data collected for purposes not related to academic review necessitates allowing data subjects (UCLA faculty), to contextualize and reframe the data for the review process. Collecting, displaying and storing these augmented records (master data with manually added metadata from faculty) has forced the project team to grapple with questions regarding fairness and transparency to both data subjects and to data consumers. How can we hold to “good design” and usability practices, while faithfully representing the inherent “messiness” of the data? How does the context in which the data was collected impact re-purposing the data for academic review? What does it mean to “own” faculty data? This paper outlines our attempts to address these questions, noting the trade-offs and limitations of the selected solutions.
This topic was presented at the 2015 iConference on March 26, 2015 in Newport Beach, CA. Since 2005, the iConference series has provided forums in which information scholars, researchers and professionals share their insights on critical information issues in contemporary society. An openness to new ideas and research fields in information science is a primary characteristic of the event.
Foundational Strategies for Trust in Big Data Part 2: Understanding Your DataPrecisely
Teams working on new initiatives, whether for customer engagement, advanced analytics, or regulatory and compliance requirements, need a broad range of data sources for the highest quality and most trusted results. Yet the sheer volume of data delivered, coupled with the range of data sources, including those from external third parties, increasingly precludes trust, confidence, and even understanding of the data and how or whether it can be used to make effective data-driven business decisions.
The second part of our webcast series on Foundational Strategies for Trust in Big Data provides insight into how Trillium Discovery for Big Data, with its natively distributed execution for data profiling, supports a foundation of data quality by enabling business analysts to gain rapid insight into data delivered to the data lake without technical expertise.
Conformed Dimensions of Data Quality – An Organized Approach to Data Quality ...DATAVERSITY
Are you looking to measure Data Quality in a more organized way? Look no further, use the Conformed Dimensions of Data Quality to organize your efforts, improve communication with stakeholders and track improvement over time. In this webinar, Information Quality practitioner Dan Myers will present the Conformed Dimensions of Data Quality framework along with the complete results of the 3rd Annual Dimensions of Data Quality survey. This presentation will provide the first view of the 2017 results, and all attendees will receive the associated whitepaper free.
In this webinar you will learn:
Why organizations use the Dimensions of Data Quality
Why there are so many options, and what he recommends you use
3rd Annual Survey data about how frequently organizations use the dimensions and specifically which dimensions are most used
Industry trends in adoption and more resources on the topic
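To ground the idea of measuring data quality along organized dimensions, here is a minimal sketch computing three commonly cited dimensions (completeness, validity, uniqueness) over a toy record set. The records, the email rule and the choice of dimensions are illustrative assumptions, not the Conformed Dimensions framework itself.

```python
# Illustrative sketch: measuring three data quality dimensions over a toy
# record set. Records, the email pattern and dimension choices are invented.

import re

records = [
    {"id": 1, "email": "ann@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "not-an-email"},
    {"id": 3, "email": "cat@example.com"},  # duplicate id on purpose
]

def completeness(rows, field):
    """Share of rows where the field is populated."""
    return sum(1 for r in rows if r[field]) / len(rows)

def validity(rows, field, pattern):
    """Share of populated values matching the expected pattern."""
    populated = [r[field] for r in rows if r[field]]
    return sum(1 for v in populated if re.fullmatch(pattern, v)) / len(populated)

def uniqueness(rows, field):
    """Share of rows carrying a distinct key value."""
    return len({r[field] for r in rows}) / len(rows)

EMAIL = r"[^@\s]+@[^@\s]+\.[^@\s]+"
print(f"completeness(email) = {completeness(records, 'email'):.2f}")
print(f"validity(email)     = {validity(records, 'email', EMAIL):.2f}")
print(f"uniqueness(id)      = {uniqueness(records, 'id'):.2f}")
```

Separating the dimensions is what makes communication with stakeholders concrete: "emails are 75% complete but only 67% valid" points at two different remediation efforts.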
Similar to Data Effectiveness: How to build a Data Driven and Reporting infrastructure (20)
Quantitative Data Analysis: Reliability Analysis (Cronbach Alpha), Common Method...2023240532
Quantitative data Analysis
Overview
Reliability Analysis (Cronbach Alpha)
Common Method Bias (Harman Single Factor Test)
Frequency Analysis (Demographic)
Descriptive Analysis
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2...pchutichetpong
M Capital Group (“MCG”) expects demand to keep growing and supply to evolve, facilitated through institutional investment rotating out of offices and into work from home (“WFH”) related sectors, while the need for data storage keeps expanding as global internet usage grows, with experts predicting 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as progressing cloud services and edge sites, allowing the industry to see strong expected annual growth of 13% over the next 4 years.
Whilst competitive headwinds remain, as illustrated by the recent second bankruptcy filing of Sungard, which blames “COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services”, the industry has seen key adjustments, and MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that the more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment, will drive market momentum forward. The continuous injection of capital by alternative investment firms, as well as growing infrastructure investment from cloud service providers and social media companies, whose revenues are expected to grow over 3.6x larger by value in 2026, will likely help propel data center provision and innovation. These factors paint a promising picture for the industry players that offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...John Andrews
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables calculation for ranks in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It however comes with a precondition of the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. Slowdown on the GPU is likely caused by a large submission of small workloads, and expected to be non-issue when the computation is performed on massive graphs.
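As a rough sketch of the levelwise idea described in the abstract (not the authors' implementation), the code below condenses a graph into strongly connected components with Tarjan's algorithm and then iterates PageRank one component at a time in topological order, so each component's ranks are finalised before downstream components use them. Per the stated precondition, the toy graph has no dead ends; the damping factor and tolerance are illustrative.

```python
# Sketch of Levelwise PageRank: SCC condensation + per-component iteration.
# Assumes no dead ends (every vertex has at least one outgoing edge).

from collections import defaultdict

def tarjan_scc(graph):
    """Tarjan's algorithm; emits SCCs in reverse topological order."""
    index, low, on_stack, stack, sccs = {}, {}, set(), [], []
    counter = [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph[v]:
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of a component
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

def levelwise_pagerank(graph, d=0.85, tol=1e-12, max_iter=1000):
    """PageRank computed component by component in topological order."""
    n = len(graph)
    preds = defaultdict(list)
    for u, outs in graph.items():
        for v in outs:
            preds[v].append(u)
    rank = {v: 1.0 / n for v in graph}
    # Reverse Tarjan's output so upstream components are finalised first;
    # contributions from already-processed components are then fixed.
    for comp in reversed(tarjan_scc(graph)):
        for _ in range(max_iter):
            new = {v: (1 - d) / n
                      + d * sum(rank[u] / len(graph[u]) for u in preds[v])
                   for v in comp}
            delta = max(abs(new[v] - rank[v]) for v in comp)
            rank.update(new)
            if delta < tol:
                break
    return rank

# Toy graph with no dead ends: one 3-cycle feeding one 2-cycle.
graph = {"a": ["b"], "b": ["c"], "c": ["a", "d"], "d": ["e"], "e": ["d"]}
ranks = levelwise_pagerank(graph)
print({v: round(r, 4) for v, r in sorted(ranks.items())})
```

Because cross-component contributions are exact by the time a component is processed, the result matches the global fixed point, which is what makes the per-level computation distributable without per-iteration communication.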
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Data Effectiveness: How to build a Data Driven and Reporting infrastructure
1. Andrew Patricio | T 571.216.2003 | www.dataeffectiveness.com
Data Effectiveness
The Consistency, Accuracy, Relevancy cycle
Council of Great City Schools
Annual Academic, Information Technology
and Research Conference
11 July 2016
3. Agenda
Introduction
What is Data Effectiveness?
Data Reporting Issues
Data Quality Culture
Consistency, Relevancy, Accuracy
Reporting Platform
Managing Data Requests
Self Service Reporting
Summary
Data Effectiveness
Data Effectiveness CGCS Annual Academic, Information Technology and Research Conference 3
4. DCPS Facts
Led by Chancellor Kaya Henderson since 2010
48000+ students (steady increase for the past 5 years) in 111 schools
• 60 Elementary Schools
• 11 Middle Schools
• 18 Education Campuses (usually Preschool through 8th grade)
• 15 High Schools
• 7 other (Special Ed etc)
6800+ school staff (3600+ teachers)
600+ central office staff
5. About Me – Andrew Patricio
(former) Deputy Chief for Data Systems
• Nov 2010 to June 2016
• During most of my tenure, I was the most senior technical resource at DCPS (a DCPS Deputy CIO position was created this year)
Personal background
• BS in Electrical Engineering
• IT & management consulting
• Current: www.dataeffectiveness.com
• Data Strategy Advising
• Building Reporting Infrastructure
• Helping Improve Data Quality
7. Data Driven Decision Making
All organizations seek to make decisions based on data
8. Data Reality
But the reality is that the data we have available is often in poor shape
9. Getting to Data Driven – Data Prep
Challenge is usually not analysis, it is getting the data ready to analyze
76% of data scientists find data prep the least enjoyable data science task
Source:
http://www.forbes.com/sites/gilpress/2016/03/23/data-preparation-most-time-consuming-least-enjoyable-data-science-task-survey-says/#7ee06c277f75
Getting the data is 90% of the work
10. Getting to Data Driven – Reporting vs Analytics
Steve Levitt, Freakonomics Podcast, 26 June 2014
“Yeah, I think the hardest single thing is that even if you have the desire … to be
data driven, that the existing systems…I never would have thought this before I
started working with companies. I never would have imagined that it is an I.T.
problem that you simply cannot get the data you want, and the data are held in
27 different data sets that have different identifiers … the I.T. support and the
complexity in these big firms blows your mind about how hard it is to do the
littlest, simple things.”
Data analysts are NOT necessarily technologists
11. Survey results – Reporting vs Analytics structure
80% have dedicated reporting team with analytics function distributed
throughout organization
12. DCPS Structure
How do we handle the workload and deal with varying levels of data quality?
Central Data Reporting team pulls student data from backend systems
• Complicated data pulls
• Data reporting self-service support
Data and Strategy data analysts collaborate with program data analysts in different offices
• Assist in vetting data requests
• Volume of requests means capacity is an issue
DCPS is very data driven, so reporting tends to be the bottleneck
District of Columbia Public Schools | 2015 CGCS CIO Conference
13. Data Driven Decision Making
High performance data analytics…
Requires pragmatic data reporting
…in the real world of data
14. Data Driven Pipeline
Data Reporting → Effective data → Data Analytics → Effective decisions → Programs/Business → Effective outcomes
Product of business is Effective Outcomes
Product of analytics is Effective Decisions
Product of reporting is Effective Data
15. What is Data Effectiveness?
Data Effectiveness is primary responsibility of reporting
Data Reporting → Effective data → Data Analytics → Effective decisions → Programs/Business → Effective outcomes
Being effectively data driven starts with Data Effectiveness:
Getting good data, when it is needed, to who needs it
16. Data Reporting Issues
17. How does Data go wrong?
Data entry issues
• Fat fingering
• Workarounds
• Solving immediate problem without thinking about
long term consequences
• Transactional system is driven by the latest action, not by the historical data changes that matter for reporting
• Poor understanding of process/policy
• Student Record Duplication
Legacy data
• Different definitions year to year (period to daily absence conversion)
• Poor QA processes (ISA definition incorrect)
• System transitions (Poor data transfer strategy from previous vendors)
18. Data Issues 1
End of year attendance example (1 particular school)
Date report run SY13-14 End of year Average Daily Attendance (ADA)
July 2014 95%
October 2014 92%
• How could attendance for the year change after the year is over?!
• Initially assumed that there was a bug in second report
• Turns out reason behind nonsensical error was that school registrar was changing
enrollment date from Aug 2013 to Aug 2014 so that those kids did not look like
they were enrolled in the 2013-2014 school year any longer
Result:
• Students who were present in SY13-14 data in June were missing in October,
severely skewing the data
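Not from the deck: a minimal Python sketch, with invented numbers and field names, of why a "final" ADA figure can shift after the year is over. ADA here is total days present over total membership days; re-dating one student's enrollment drops that student out of the prior-year pull entirely, so the same query returns a different number.

```python
def average_daily_attendance(students):
    """ADA = total days present / total membership days across the pulled students."""
    present = sum(s["days_present"] for s in students)
    membership = sum(s["membership_days"] for s in students)
    return present / membership if membership else 0.0

# Two pulls of "the same" school-year data. Before the second pull, a
# registrar re-dated the second student's enrollment into the next school
# year, so that student no longer appears in this year's query at all.
july_pull = [
    {"days_present": 170, "membership_days": 180},
    {"days_present": 120, "membership_days": 180},  # later re-dated out of the year
]
october_pull = july_pull[:1]

print(round(average_daily_attendance(july_pull), 3))    # 0.806
print(round(average_daily_attendance(october_pull), 3))  # 0.944
```

The direction and size of the shift depend on which students get re-dated; the point is that the "end of year" number is not stable if enrollment history can still be edited.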
19. Data Issues 2
Example: Enrollment overlaps
Student Information System (SIS) is transactional system, only tracks current state
• For enrollment it doesn’t care about data values in enrollment history
• Only cares about latest enrollment action (admit or withdrawal) and school
• Actual enrollment history in system is merely log of events
• Users can willy-nilly adjust enrollment history with no effect on current status
20. Data Issues 3
School Dashboard vs
Weekly reports
Idea was to get more
regularly updated data
to schools
Inconsistencies
reduced trust in data
Two different queries implementing the same metric but poor data quality
meant slightly different answers. Example:
• “Current School” on student table used for dashboard queries
• Didn’t always match school based on enrollment history used in reports
21. Survey Results – data issues
Challenges to data reporting at your district (11 responses)
Area total max min avg stddev
Requirements - business rules are communicated clearly 36 5 2 3.27 0.96
Siloed data - correlating data from different systems 36 5 1 3.27 1.29
Validity - keeping number of data entry errors low 35 5 1 3.18 1.27
Capacity - managing the number of report requests 33 5 2 3.00 1.35
Efficiency - quick delivery of report when requested 32 4 1 2.91 0.90
Complexity - reports requests easily coded 30 4 1 2.73 1.05
Repeatability - recreating same metric in various reports 28 5 1 2.55 1.44
Veracity - data values match reality 27 3 1 2.45 0.78
Reliability - data reports do not often need rework 26 4 1 2.36 0.88
Utility - data reports are useful and relevant 25 3 1 2.27 0.86
22. Fixing Data Quality
How do we make our data more effective given these challenges?
Improve Data Quality long term?
Make data driven decisions today?
23. Long term – Data Quality Culture
24. Data Entry – Front End Validation?
Legacy issues: once you've identified and fixed these, they will not return
The ongoing issue is data entry problems
• Need to balance flexibility/freedom of entry with validation checks
• Most systems can validate based on patterns or entries but do not have enough flexibility to
differentiate between other valid and invalid entries
At DCPS, when a school doesn't have the access to make a needed data change,
they enter a data modification request for the tech team to handle
• Strictness of data entry checks needs to balance against technical team capacity
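Not from the deck: a hypothetical Python sketch of the pattern-based validation the slide describes. The rules and field names are invented; the point is that format checks like these catch fat-fingered entries, but a value that is well-formed and still wrong sails through, which is why the error-report cycle with schools is still needed.

```python
import re

# Hypothetical front-end (or nightly error-report) rules: pure format checks.
RULES = {
    "student_id": lambda v: bool(re.fullmatch(r"\d{7}", v)),  # seven digits
    "grade": lambda v: v in {"PK3", "PK4", "K"} | {str(g) for g in range(1, 13)},
}

def validate(record):
    """Return the list of fields that fail their pattern check."""
    return [field for field, ok in RULES.items() if not ok(record.get(field, ""))]

print(validate({"student_id": "1234567", "grade": "9"}))   # [] -- passes
print(validate({"student_id": "12345o7", "grade": "14"}))  # both fields flagged
```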
25. Reporting relationship to schools
All central office teams support schools
Difference with reporting team is that schools are really our "data entry team"
rather than our “users”
• Successful data reporting intimately tied to their effectiveness
• Perfect system which schools are not comfortable with will still have bad data quality
Data Effectiveness
Data Effectiveness CGCS Annual Academic, Information Technology and Research Conference 25
Data
Analytics
Programs /
Business
Effective
decisions
Data
Reporting
Effective
data
26. “Data Entry Team” rather than “Users”
“Data Entry Team” is a part of things, “Users” are on the other side
Taking this point of view automatically fosters more collaboration
• Connecting the dots for schools by tracing the pathway from a specific data entry error to specific
issue on data report instead of just mandating behavior top down
• Eg duplicate Attendance Intervention entries make situation look worse because of double counting
• Data error reports include step by step how to’s specifying how issue can be fixed
• Working to include direct link to relevant student in SIS to minimize context switching
• Focus groups, feedback sessions
• Getting school staff input on how to make data entry more efficient
27. Fixing Data
Error Correction Cycle
• Feed back errors to schools for them to correct
• Central office team looks for other common data entry errors to either prevent through
front-end validation or add to error reports going to schools
The cycle: Aspen (SIS) feeds the Reporting Platform, which feeds the Data Integrity Management System (soon to be Certica Certify). That system generates Data Error Reports; schools fix the data errors in the SIS, while central office reporting and data analyst teams add improved SIS validations and additional error patterns.
28. Data Integrity Management System
29. Today – C.A.R. cycle
30. Being Data Driven requires a C.A.R.
The problem is like building the track while the train is moving down it. Even when data
quality is not good, you still have to provide reports and make decisions; you cannot
wait until everything is perfect, because that is a moving target.
"Good enough" is good enough, but what is good enough?
Consistency
Accuracy
Relevancy
31. Consistency, Accuracy, Relevancy cycle
Goal is to have accurate metrics aligned with business goal
• Cannot talk about accuracy if there isn’t agreement on the value being reported
• Once the value is consistent, you can talk about if it’s accurate
• Once it’s accurate you can talk about whether it’s relevant to business goal
Metric A inconsistent across reports (Report 1: 90, Report 2: 81, Report 3: 87)
→ Consistent: same value in every report (87, 87, 87)
→ Accurate: value verified against reality (85, 85, 85)
→ Relevant: metric aligned with the business goal
→ If not relevant: determine a proposed change and go through the cycle again
Consistency = DATA, Accuracy = INFORMATION, Relevancy = KNOWLEDGE
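Not from the deck: a minimal Python sketch of a consistency check in the C.A.R. sense. Values for the same metric with the same parameters are grouped across reports; any group with more than one distinct value is not yet consistent, regardless of which value is "right". Report and metric names are illustrative.

```python
def consistency_check(reports):
    """Group values by (metric, parameters); return the groups that disagree across reports."""
    seen = {}
    for report, metric, params, value in reports:
        seen.setdefault((metric, params), {})[report] = value
    return {key: vals for key, vals in seen.items() if len(set(vals.values())) > 1}

pulls = [
    ("Report 1", "Metric A", ("SY15-16",), 90),
    ("Report 2", "Metric A", ("SY15-16",), 81),
    ("Report 3", "Metric A", ("SY15-16",), 87),
]
print(consistency_check(pulls))  # Metric A disagrees: not yet consistent
```

Accuracy is a separate, later question: only once this returns an empty dict for a metric does it make sense to audit the single reported value against reality.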
32. Consistency – DATA
“What numerical value is being shown for this metric?”
Driven by reporting
Consistency means literally just that: a metric has the same value for the same
parameters no matter who pulls it. Matching reality is not the focus at this stage
Factors
• Traceability – same metric in different reports must be traced back to same source
• Same parameters – need to be careful because different metrics could be referred to by
the same common name
• “# of absences” – unexcused? ISA? Truancy?
• Time factor – legitimate changes can be made after report is run
Wkly Unexcused Abs | Date pulled | Reason for difference
100 | Oct 12 | First pull
95 | Oct 19 | Data corrected
90 | Oct 26 | Suspensions approved; don't count as unexcused
33. Accuracy – INFORMATION
“Is the numerical value shown for this metric correct?”
Driven by Analytics
Once you have consistency, you can work on accuracy, i.e. does the value reflect
what is actually happening in reality?
Verify by comparing against a manually calculated metric or physical audits
Metric could be “inaccurate” because
• Bug in query – fix
• Wrong or inconsistent business rules – nail down definitions, two different sets of
business rules for same metric could be appropriate (eg one school year vs another)
• Data quality – identify source and reason for poor data quality, make sure to verify
calculations using only good data quality data
34. Relevancy – KNOWLEDGE
“Is this metric helping to meet our goal?”
Driven by business
Once you have accuracy, then you can determine whether that metric is useful.
With consistency and accuracy, it may be that you are not measuring what you
thought you were. Or what you are measuring doesn’t really impact outcome.
If not, then either business goal or metric needs to change
• Changing metric
• Use new metric – longer to get consistency, cycle could be just as long or longer
• Refine business rules of existing metric – less effort to get consistency, shorter cycle
• Changing business goal
• Effective data in hand is worth two in the bush
• Tail could be wagging the dog but unmeasurable business goal is just a wish
35. Cycle
As data becomes information becomes knowledge, the data sophistication of the
process grows which requires more/different metrics
Different metrics could be at different points in the cycle
(Diagram: a field of small C.A.R. cycles, each metric at its own stage of Consistency → Accuracy → Relevancy.)
37. Single system for operations and reporting
Previously used SIS as reporting data store and transactional system
• Made querying a bear due to complex data model for transactional system
• All reports required technical team capacity, even simple ones
• Highly normalized = even simple information was stored in a very complicated way
• All business rules were implemented in query code created by contractor
• Difficult to change when rules changed
• Often query code itself was only “documentation”
Example: find Residency Verification
select decode(afv.value, null, 'N', 438, 'N', 'Y') as "Residency Verification SY13-14"
from students p, adhoc_fields_values afv, adhoc_fields_drop_downs afdd
where p.pupil_number = afv.pupil_number(+)
and afv.adhoc_fields_def_id(+) = 109
and afv.adhoc_fields_def_id = afdd.adhoc_fields_def_id(+)
and afv.value = afdd.field_key_value(+)
38. Reporting platform - Speed
Data model focused on reporting, not on transactions
• space vs speed tradeoff highly biased towards speed
• Virtually unlimited disk space
• Batch processing not real time
• Complete flexibility to organize data optimally for ease of reporting
• Central store for all siloed data (data-warehouse lite)
(Diagram contrasts an example transactional data model with the simplified reporting data model: flat tables such as Student Demographics, Enrollment, Attendance Base, Assessment, Course Credits, Course Teachers, Grad Progress, School Stats.)
39. Reporting platform – Ease of Use
Really nothing more than a dedicated reporting database, not data warehouse
Data model can be tailored for reporting
• Keeps track of all changes, not just latest data (valid from, valid to)
• Super flat, highly denormalized = easily understood data model
• Redundancy okay so long as we have data traceability
• Same base data stored in multiple formats/structures for different uses
• Fewer joins so can shift technical capacity to more complex business rules
• Can be exposed more directly to data analysts for increased self-service
Transactional model:
select decode(afv.value, null, 'N', 438, 'N', 'Y') as "Residency Verification"
from students p, adhoc_fields_values afv, adhoc_fields_drop_downs afdd
where p.pupil_number = afv.pupil_number(+)
and afv.adhoc_fields_def_id(+) = 109
and afv.adhoc_fields_def_id = afdd.adhoc_fields_def_id(+)
and afv.value = afdd.field_key_value(+)

Reporting model:
select [Residency Verification]
from student_demographics_snapshot
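Not from the deck: a small Python sketch of the "keeps track of all changes (valid from, valid to)" idea mentioned above. Each row carries a validity window; querying "as of" a date filters on that window, and a far-future sentinel closes the current row (the sentinel date here is an assumption; the deck uses 1/1/3030 for enrollment).

```python
from datetime import date

FAR_FUTURE = date(3030, 1, 1)  # sentinel "valid_to" for the current row

def as_of(rows, when):
    """Rows are (valid_from, valid_to, data); return the data valid at `when`."""
    return [data for vf, vt, data in rows if vf <= when < vt]

history = [
    (date(2015, 8, 24), date(2016, 1, 10), {"school": "123"}),
    (date(2016, 1, 10), FAR_FUTURE, {"school": "456"}),
]
print(as_of(history, date(2015, 12, 1)))  # school at the time: 123
print(as_of(history, date(2016, 3, 1)))   # current school: 456
```

Because every change closes the old row and opens a new one, trend reporting becomes a simple filter rather than a reconstruction from event logs.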
40. Reporting platform - Consistency
Common processing
• Common query code centralized
• Batch ETL so can make multiple passes to pre-calculate higher order metrics
Consistent business rules
• Can have old and new metrics back-calculated as well (eg old vs new truancy rules)
• Calculate metric in one place so one number, right or wrong, is reported
Data Traceability
• Data path from systems of record to reports fully documented
(Image: herding kittens vs. one easy, powerful cat.)
41. Reporting Platform Architecture
SSIS, SQL Server, and Perl on virtual machine servers
Sources: Aspen (current SIS), STARS (legacy SIS), SpEd data system, ELL data system, assessment data dumps and files, and miscellaneous systems and data files
→ ETL: SQL Server Integration Services (SSIS), Perl, manual loads
→ Reporting Platform (MS SQL Server)
→ Data Mart (MS SQL Server)
→ Direct SQL access (SQL Server Management Studio)
42. Reporting Platform Examples – Attendance base table
Based on weekly attendance report
Updated daily
Calculates individual student attendance metrics
Use values from this table whenever reporting on attendance
Metric | Details
Truancy | Calculated under both old and new rules so trends can be compared
Absence Counts | Period and Daily; Unexcused, Excused, In Seat Attendance, Suspension
Attendance Interventions | 3, 5, 10, 20 day intervention letters needed and sent; Child & Family Services/Court referrals, Police pickups
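Not from the deck: a toy Python sketch of the per-student tallies a base table like this pre-calculates. Category names and the ISA formula (days present over membership days) are simplified assumptions, not DCPS business rules.

```python
def attendance_counts(events):
    """Tally daily attendance categories for one student from raw daily events."""
    counts = {"unexcused": 0, "excused": 0, "suspension": 0, "present": 0}
    for kind in events:
        counts[kind] += 1  # one event per membership day
    membership = len(events)
    # Simplified In Seat Attendance rate: days present over membership days.
    counts["isa_rate"] = counts["present"] / membership if membership else 0.0
    return counts

events = ["present"] * 17 + ["unexcused", "excused", "suspension"]
print(attendance_counts(events))
```

Computing these once, centrally, and reusing the stored values in every report is what makes the numbers consistent across reports.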
43. Reporting Platform Examples – enrollment matching
Enrollment admit withdraw matching
• SIS stores enrollment as separate admit and withdraw events
• Need to match admits to withdrawals for the same enrollment period and school
Raw event log:
Date | Type | School
24 August 2011 | Admit | 123
24 June 2012 | Withdrawal | 123
24 June 2012 | Admit | 456
10 October 2012 | Withdrawal | 456
11 October 2012 | Admit | 789

Matched enrollment spans:
Admit Date | Withdraw Date | School
24 August 2011 | 24 June 2012 | 123
24 June 2012 | 10 October 2012 | 456
11 October 2012 | 1 January 3030 | 789

Currently enrolled students have a "withdrawal date" in the far future (1/1/3030) so that there is an actual date, not a null, to compare against:
(today() < [withdraw date]) as "currently enrolled"
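The matching above can be sketched in a few lines of Python (not the production logic, and it assumes a well-ordered log with no orphan events). Withdrawals are processed before same-day admits so a same-day transfer closes the old span first, and an open admit is closed at the deck's far-future sentinel.

```python
from datetime import date

FAR_FUTURE = date(3030, 1, 1)  # the deck's "currently enrolled" sentinel

def match_enrollments(events):
    """Pair Admit/Withdrawal events into (admit, withdraw, school) spans."""
    spans, open_admit = [], None
    # On ties, process Withdrawal before Admit so same-day transfers close first.
    for when, kind, school in sorted(events, key=lambda e: (e[0], e[1] != "Withdrawal")):
        if kind == "Admit":
            open_admit = (when, school)
        else:
            admit_date, admit_school = open_admit
            spans.append((admit_date, when, admit_school))
            open_admit = None
    if open_admit:  # still enrolled: close the span at the far-future date
        spans.append((open_admit[0], FAR_FUTURE, open_admit[1]))
    return spans

log = [
    (date(2011, 8, 24), "Admit", "123"),
    (date(2012, 6, 24), "Withdrawal", "123"),
    (date(2012, 6, 24), "Admit", "456"),
    (date(2012, 10, 10), "Withdrawal", "456"),
    (date(2012, 10, 11), "Admit", "789"),
]
print(match_enrollments(log))  # three spans, last one open-ended
```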
44. Reporting Platform Examples - Assessment
Generally two ways we need to analyze assessments:
• Data in rows – each row is a single assessment for a single student for a particular school year; used to compare one run of an assessment with another

Student | Assessment | SY | Score
123 | A1 Q1 | SY1415 | 90
123 | A1 Q2 | SY1415 | 80
123 | A1 Q3 | SY1415 | 70
123 | A1 Q4 | SY1415 | 100
456 | A2 Sem 1 | SY1415 | 65

• Data in columns – each row is a single student for a particular school year; gives a single view of all assessments for that student

Student | A1 Q1 | A1 Q2 | A1 Q3 | A1 Q4 | A2 Sem 1 | A2 Sem 2 | SY
123 | 90 | 80 | 70 | 100 | 76 | 87 | SY1415
456 | 60 | 70 | 80 | 90 | 65 | 86 | SY1415

The key is that both are processed from the exact same data sets at the same time, so they contain the same data stored in two different structures.
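The rows-to-columns transformation is a plain pivot; here is a minimal Python sketch (illustrative only, using the sample values above) that builds the wide view from the long one.

```python
def pivot_wide(rows):
    """Rows: (student, assessment, sy, score) -> one record per (student, sy)
    with each assessment's score in its own column."""
    wide = {}
    for student, assessment, sy, score in rows:
        rec = wide.setdefault((student, sy), {"Student": student, "SY": sy})
        rec[assessment] = score
    return list(wide.values())

rows = [
    (123, "A1 Q1", "SY1415", 90),
    (123, "A1 Q2", "SY1415", 80),
    (123, "A1 Q3", "SY1415", 70),
    (123, "A1 Q4", "SY1415", 100),
]
print(pivot_wide(rows))  # one wide record for student 123
```

Running the pivot in the same ETL pass as the long-table load is what guarantees the two structures never disagree.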
45. Reporting Platform Development
Biggest challenge was how to develop system when we had poor data quality
How could we avoid introducing more errors?
Solution
• Prioritize – Start with standard recurring reports (eg attendance weekly)
• Compartmentalize – Run reports using only students with no data quality issues
46. Reporting Platform Development
Need to ensure that reporting platform is not introducing new errors. How?
Use only known good data to validate:
(Flowchart: the same report query is run against the Reporting Platform twice, as a Standard Report over all students and as a Sample Report over "good data" students only, and the two are compared. If a discrepancy stems from the reporting platform or the query, fix it; if it stems from bad data, filter those students into an exceptions table. No discrepancies means the report is validated.)
47. Reporting Platform Development
1. Create Sample Report and compare to Standard Report (eg attendance
weekly)
2. Check for discrepancies
1. If discrepancy is due to mistake in reporting platform or query, fix it
2. If discrepancy is due to bad data, store student id in exceptions table
3. Pull Sample Report again, filtering out exception students so that only “Good
Data” is included in report
4. Continue until no discrepancies
Example “Bad Data” exceptions:
1. Overlapping enrollments
2. Absences outside of enrollment
3. Missing data
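The loop in steps 1-4 can be sketched in Python. This toy version (invented student IDs and values) assumes discrepancies are bad data; a real pass would first rule out platform or query bugs, as step 2.1 says.

```python
def reconcile(platform_report, standard_report):
    """Iterate the validation loop: students whose platform numbers disagree
    with the trusted standard report go into the exceptions table; the
    comparison then repeats over the remaining 'good data' students."""
    exceptions = set()
    while True:
        good = {s: v for s, v in platform_report.items() if s not in exceptions}
        diffs = {s for s, v in good.items() if standard_report.get(s) != v}
        if not diffs:
            return good, exceptions  # report validated over good-data students
        exceptions |= diffs  # assume bad data (query bugs already ruled out)

platform = {"s1": 10, "s2": 12, "s3": 7}
standard = {"s1": 10, "s2": 11, "s3": 7}
good, bad = reconcile(platform, standard)
print(sorted(good), sorted(bad))  # s2 lands in the exceptions table
```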
48. Managing Data Requests
49. Survey results – requestors of reports
Who is asking for data and most often with what frequency?
Counts are number of districts who report that frequency for that requester type
Who | N/A | Yearly | Monthly | Weekly | Daily | Ad-hoc
Parents/community | 3 | 4 | 0 | 0 | 1 | 3
Students | 6 | 1 | 0 | 0 | 2 | 2
Teachers | 0 | 1 | 1 | 3 | 4 | 2
School administrators | 0 | 0 | 2 | 4 | 4 | 1
Central office program staff | 0 | 0 | 3 | 3 | 2 | 3
District leadership | 0 | 0 | 3 | 5 | 3 | 0
State/Fed Dept of Ed or other external org | 1 | 3 | 2 | 1 | 2 | 2
50. Capacity vs Demand
Demand for data is ever increasing, people are hungry for data
Needed to do more with the same size team
Two Tracks
•Increase reporting efficiency
•Reduce demand on reporting team
51. Increase Efficiency
Users make requests via online “Data Request Tool” (DRT)
• Central point of communication with requestors for clarifications
• Tracks implementation notes and report writer assignments
• Report files attached to request along with query code
• One report can be attached to multiple requests to allow for reuse
• “Student Data Current” report available on front end
• Updated daily with common student metrics (absences, GPA, grade level, school, etc)
• User can customize columns/filters to download for themselves
• Example of some columns available:
Student_ID YTD_Unexcused_Absences Total SBT Suspension_Days
School_Name YTD_Excused_Absences Truant - still be truant?
ELL_Status YTD_ISA_Average_Attendance Truant_>=10_days
FARM_Status Membership_days Current_School_Average_Attendance
Student_Race Absences_Towards_Truancy Current_School_Excused_Absences
SPED_Status Suspension_Absences_Days Current_School_ISA_Average_Attendance
52. Increase Efficiency
“Data Request Tool” (DRT)
53. Increase Efficiency
Data Librarian is first point of contact for requests to reporting team
• Dedicated FTE position
• Clarifies request requirements
• Is there an already completed report that can fulfill this request?
• Acts as gatekeeper to qualify requests before they hit reporting capacity
(Flow: Program needs data → does a Standard Report or the Student Data Current pull fit? If yes, use it. If not, the program enters a Data Request; the Data Librarian clarifies the request and checks whether an existing report is available. If so, the report is delivered; otherwise a Report Writer is assigned and the report is created, reviewed, and delivered.)
54. Self Service Reporting
55. Self Service Reporting
Goal was to provide self-service reporting to analysts while ensuring consistency
• Giving them raw access to reporting platform is too overwhelming
• Analysts are not database developers/DBAs
• Requires SQL skills: eg would still need joins, aggregations to get meaningful data
• Creating dedicated pull of custom data would mean another thing to maintain
Crawl before we can walk,
Walk before we can run,
Run before we can fly
56. Self Service Reporting
Solution was to rely on already existing standard reports
• Enrollment Daily, Attendance Weekly, ACGR, Student Data Current
Weekly snapshot of each report was saved into a dedicated “data mart”
• Analysts were already used to seeing these reports so no learning curve
• These were official reports so data was guaranteed to match our official numbers
• Added benefit of saving historically reported official numbers
Not quite flying yet, but closer…
57. “Data Mart” example - Enrollment
Report #1612 is a daily report of the enrollment for every student in DCPS
• Forms the basis of how enrollment is monitored throughout the year
• Especially important during pre-enrollment for upcoming school year
• Also forms basis for denominator when calculating “percent of total students” metrics
• Data mart data model has exact same columns as DRT with addition of “report date”
(Diagram: the Enrollment Daily report runs on the Reporting Platform; each week's run (Enrollment DR 8/24, 8/31, 9/7, 9/14, … 6/13) is saved into the Enrollment Data Mart, which data analysts query via direct SQL.)
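Not from the deck: a tiny Python sketch of the snapshot-append pattern the data mart uses. The mart rows are the report's own columns plus a "report date" column, so historical official numbers are preserved exactly as reported. Field names are illustrative.

```python
from datetime import date

data_mart = []  # the "Enrollment Data Mart": accumulated snapshot rows

def save_snapshot(report_rows, report_date):
    """Append one run of the report to the mart, stamping each row with the report date."""
    data_mart.extend({**row, "report_date": report_date} for row in report_rows)

save_snapshot([{"student": 1, "school": "123"}], date(2015, 8, 24))
save_snapshot([{"student": 1, "school": "123"}], date(2015, 8, 31))
print(len(data_mart), data_mart[-1]["report_date"])
```

Because each snapshot is immutable once saved, "what did we officially report that week?" is always answerable by filtering on report_date.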
58. Report requests hitting report writers
(Chart: Report Writer Data Requests per Month, Aug through Jul, for SY12-13 through SY15-16.)
More self-service reporting and standardized reports
• Fewer ad-hoc requests for standard data
• Reporting capacity can be spent on more complex requests
60. Data Effectiveness
Data Reporting → Effective data → Data Analytics → Effective decisions → Programs/Business → Effective outcomes
Data Driven Decision Making starts with Data Effectiveness
61. How to make data more effective
Consistency first, then Accuracy, then Relevancy
Metric A inconsistent (90, 81, 87) → Consistent (87, 87, 87) → Accurate (85, 85, 85) → Relevant (metric aligned with goal)
Improve data quality by seeing school staff as a "data entry team" instead of "users"
62. Take Aways
Meet your data where it is today and build to where you want to be
Take some time to do the
work today that will help
you tomorrow…
•Data Quality Culture
•C.A.R. Cycle
•Data Request Process
There’s flying and then
there’s flying. Good enough
is probably good enough.
•Reporting Platform
•Data Marts
65. State of Data at DCPS
66. DCPS Data Systems Evolution
10+ years ago
• Critical data tracked poorly or not at all
• Heavy manual effort in using data to support operations
• Very difficult to pull ad hoc reports or change existing reports
• Only way to do trend reporting was via Excel
5 years ago
• Systems of record exist, but data flow is one-way and not easy
• All critical data is tracked somewhere, but in separate systems
• Very ad-hoc reporting: every request was its own query, and "reuse" meant copy-paste
• Trend reporting done from yearly snapshots
Today
• Consolidated view of student, employee, and school data
• Robust reporting infrastructure increases capacity
• More self-service reporting available
• Better data quality
• All changes tracked, so trend reporting is much easier
(Timeline: evolution from poor data capture; mainframe replaced by an Oracle Forms system (AAL eSIS), SIS upgraded (Follett Aspen), separate reporting database created; gains first in Data Capture, then Data Quality, then Reporting ROI.)
67. Reporting requests – Example 1
Average of 80 complex data reporting requests per month
Attendance Weekly
• Summarizes weekly and YTD attendance for every student at DCPS
• Absence stats: In Seat Attendance (ISA), truancy, unexcused absences
• Intervention: 3-, 5-, 10-, 15-, and 20-day attendance letters, meetings, etc.
• Also includes behavior stats: suspensions and suspension days
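The weekly roll-up above can be sketched as a small per-student computation. This is a hypothetical illustration: the field names, the ISA formula (days present over days enrolled), and the rule that a letter triggers at each absence threshold are assumptions, not DCPS's actual schema or business rules.

```python
# Hypothetical sketch of a weekly attendance roll-up.
# Field names and the ISA formula are assumptions for illustration.

LETTER_THRESHOLDS = [3, 5, 10, 15, 20]  # unexcused-absence counts that trigger letters

def in_seat_attendance(days_present: int, days_enrolled: int) -> float:
    """ISA = share of enrolled days the student was actually in seat."""
    return days_present / days_enrolled if days_enrolled else 0.0

def triggered_letters(unexcused_absences: int) -> list:
    """Return every letter threshold the student has crossed so far."""
    return [t for t in LETTER_THRESHOLDS if unexcused_absences >= t]

def weekly_row(student: dict) -> dict:
    """One output row of the weekly report for a single student."""
    return {
        "student_id": student["id"],
        "isa_ytd": round(in_seat_attendance(student["present"], student["enrolled"]), 3),
        "letters_due": triggered_letters(student["unexcused"]),
    }

row = weekly_row({"id": "S001", "present": 160, "enrolled": 170, "unexcused": 6})
# e.g. {'student_id': 'S001', 'isa_ytd': 0.941, 'letters_due': [3, 5]}
```

In a real pipeline this logic would run against the reporting database rather than in-memory dicts, but the shape of the calculation is the same.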
68. Reporting requests – Example 2
Adjusted Cohort Graduation Rate – calculates individual graduation progress per credit for all high school students and flags the degree to which each student is off track
• Tracks 4 year cohorts across all high schools at DCPS
• Looks at grades in currently scheduled courses as well as credits received
Example columns:
• Status: On Grade Level, Grade, On Potential Grad List, Total # Credits, Credits Needed to Graduate
• On-track flags: On-Track 4 Core Subjects, On-Track English, On-Track Math, On-Track Science, On-Track Social Studies
• Cumulative: English Cumulative, Math Cumulative
• Course-level: Enrolled in English III, Passing English III, Enrolled in English IV, Passing English IV, Enrolled in Math, Passing Math
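The off-track flag behind columns like these can be sketched by comparing credits earned against a straight-line pace over the 4-year cohort. This is purely illustrative: the 24-credit total and the off-track cutoffs are invented for the example, while the real calculation follows the district's graduation requirements.

```python
# Illustrative on-track check for a 4-year graduation cohort.
# The credit total and cutoffs are assumptions, not DCPS policy.

CREDITS_TO_GRADUATE = 24.0  # assumed total credits required

def on_track_status(credits_earned: float, years_in_cohort: int) -> str:
    """Compare earned credits to a straight-line pace toward graduation."""
    expected = CREDITS_TO_GRADUATE * years_in_cohort / 4
    gap = expected - credits_earned
    if gap <= 0:
        return "on-track"
    if gap <= 2:
        return "slightly off-track"
    return "significantly off-track"

# A second-year student needs ~12 credits to be on pace:
print(on_track_status(12.5, 2))   # on-track
print(on_track_status(10.5, 2))   # slightly off-track
print(on_track_status(6.0, 2))    # significantly off-track
```

Degree-of-off-track flags like this let schools triage intervention effort instead of treating all off-track students identically.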
70. Survey results – District Sizes
Survey conducted via the CGCS CIO mailing list: 11 responses total
72. Data Sophistication Cycle
Is being results-oriented incompatible with being data-driven?
• In a results-oriented organization, the push is to "get things done," and the velocity of the need often makes it difficult for data systems to keep up.
• As a result, the data-driven side gets starved of investment.
73. Data Sophistication Cycle
Data capture system evolves along with process sophistication
Reporting sophistication should keep pace with data quality
Example data entry systems and their key data structures (as process sophistication grows, data quality grows, enabling greater reporting sophistication):
• Notepad → open entry
• Excel → data cells
• MS Access → data records
• Student Information System (SIS) → normalized data model
• Reporting system separate from SIS → reporting data model
Don't build a formal data warehouse for Excel "data systems"!
74. Data Effectiveness.
Don't over-engineer the tracking system; it should lead, but not exceed, process maturity.
Example
• Relative vs Absolute metrics
• High stakes (IMPACT) vs experimental (RTI)
Different groups or initiatives in the same organization can be at different points in the cycle.
Make sure you identify what level a particular need is at; data tracking and reporting need not be more sophisticated than the business process in question.