ISCRAM 2013: Designing towards an impact evaluation framework for a collaborative information supply chain
 

Authors: Kenny Meesters, Bartel Van de Walle
Tilburg University

Speaker Notes
  • First step: how can we determine the impact? Why is this important? We develop many tools and provide services; how effective are they, and how can we better focus our efforts?
  • Impact evaluation is an important part of further steering the development of the V&TC: understanding how they affect the response and how this can be improved.
  • Our approach: looking at business practice to derive theories and practices. This requires translation.
  • How do evaluations work, what are they made up of?
  • What are evaluations used for?
  • What can we use evaluations for?
  • How do we build our evaluation framework?
  • We have two types: look at the outcomes, or look at the process.
  • Constructing an evaluation framework. Ingredients: objective (assessment, evaluate a project, evaluate a program), scope, measurements, and indicators, each derived from the previous one. How does it fit into a process?
  • IT is a small part of the entire operation. Let's start by examining the scope. The main focus is the use of products in the processes and organization of the decision maker, not including the impact on the decision made. With each level, more factors (confounding variables) are introduced, blurring the measurement.
  • Next, look at the measurements; again two views: the efficiency of the generation of products, and the effectiveness of the use of those products. Combined, they yield the impact.
  • From measurements we derive indicators. Various existing frameworks are used to find the indicators, which are then translated to the V&TC community.
  • Two case studies were used to test the framework; they do not give general information. The objective was to detect differences between two IS deployments with some similarities and some differences. Results were checked with interviews.
  • Example output of the evaluation framework: good in data collection and facilities; high productivity, but for a limited amount of time.
  • Systems: high availability and usability in the V&TC systems. Processes are limited: capacity increase, cost-effectiveness.
  • Yes, we can design an evaluation framework that provides useful results. The analysis shows that the framework is able to pick up on key differences between 'regular' and V&TC deployments. Main points: integration and sustainability. For the framework: embed it in the process, make it easy to use and part of the deployment. It still depends on sampling, and needs quantifiable indicators and automation.
  • The thesis was to demonstrate feasibility and explore the domain; we succeeded. The goal was NOT to develop an accurate framework, but to lead by example and discussion. The biggest question: how are we going to use evaluation, and what for? Importance: managing the growing pool of options for decision makers, and improving the quality of systems in real time. This will determine the refinements and research efforts for further development.
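The notes above describe two measurement views — the efficiency of product generation and the effectiveness of product use — that combine into impact. The presentation does not prescribe a formula; as a minimal sketch, assuming indicators are normalized to 0–1 scores (the indicator names and the aggregation are illustrative assumptions, not the authors' method), the combination could look like:

```python
from statistics import mean

def impact_score(efficiency: dict[str, float], effectiveness: dict[str, float]) -> float:
    """Combine normalized (0-1) indicator scores from the two measurement
    views into a single impact figure. Aggregating each view with a mean
    and combining them with a product is an assumption for illustration."""
    eff = mean(efficiency.values())      # product-generation efficiency
    use = mean(effectiveness.values())   # effectiveness of product use
    return eff * use                     # both views must score well for high impact

# Hypothetical indicator scores for one deployment
efficiency = {"schedule_compliance": 0.9, "data_collection_efforts": 0.7}
effectiveness = {"usability": 0.8, "decision_effectiveness": 0.6}
print(round(impact_score(efficiency, effectiveness), 2))  # → 0.56
```

The multiplicative combination reflects the point made above: efficient product generation alone yields little impact if the products are not effectively used, and vice versa.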

Presentation Transcript

  • Designing towards an impact evaluation framework for a collaborative information supply chain — Kenny Meesters, Bartel Van de Walle. ISCRAM, Baden-Baden, May 2013
  • Outline: Domain, Processes, Evaluation types, Systems evaluation, Evaluation perspectives, Scope, Measurement, Concept, Supply, Usage, Findings, Objectives, Indicators, Conclusion, Results, V&TC, Design, Approach, Research, Future work
  • V&TC information supply chain: Data collection (media, geo-location, SMS) → Data processing (analysis, verification) → Dissemination (information products: maps, reports, etc.) → Information consumers (decision making, monitoring)
  • V&TC landscape: Volunteer Training, Open*, SBTF, Transition, Impact Evaluation, Decision Makers, Data Scramble, Data Licensing, Preparedness, UN OCHA IM (actors include ISCRAM, IMMAP, Google, Mapaction, Woodrow, UN, Harvard, OSM, GISCorp, Munster, ICT4Peace, SBTF, UNV, UvT, TBC). "The challenge is to improve coordination between the structured humanitarian system and the relatively loosely organized volunteer and technical communities." — Valerie Amos, UN Under-Secretary-General
  • Impact evaluation (Business ↔ V&TC, Theory ↔ Practice): What do we know? What do we want to know? What do we need?
  • In general: (1) Define — indicators, situations; (2) Measure — status quo, new situation; (3) Analyze — comparison, conclusion
  • Applications: 1. Impact evaluation, 2. Impact assessment, 3. Program evaluation (Project A, Project B)
  • Use for the V&TC: determine how well specific initiatives perform; adjust and fine-tune specific decisions/projects; determine the 'best' response; manage provided solutions; secure resources; advocate the V&TC
  • Next steps: design principles of frameworks (types, measurements, indicators); V&TC (objective, scope and focus, indicators); evaluate the framework (case studies; refinable, usable tools)
  • Evaluation types: general (formative vs. summative), perspective (resource-centered vs. goal-centered), systems (efficiency-oriented vs. effectiveness-oriented)
  • System evaluation — efficiency-oriented: resource investment, resource consumption, system performance; effectiveness-oriented: production capability, organizational performance, user performance
  • Evaluation implementation: the evaluation framework sits with project management, spanning departments and (sub)projects, and combines the efficiency-oriented and effectiveness-oriented views
  • Scope (from consumer side down to supplier side): 1. Overall impact of the response to a crisis; 2. Impact of the decision-making process on the crisis; 3. Impact of information products on the decision-making process; 4. Effect of data processing on information products; 5. Impact of data collection on data processing; 6. Soft- and hardware impact on system performance
  • Measurements — efficiency-oriented perspective: level 0: request/definition, level 1: resource allocation, level 2: team capability, level 3: investments; effectiveness-oriented perspective: level 1: support & information, level 2: decision making, level 3: response effectiveness. In the V&TC deployment, system implementation and product-generation efficiency feed into response effectiveness, together yielding impact.
  • Indicators (objective → performance measure → applied to the V&TC):
    Level 1: Resource allocation — System development: facilities allocation (availability of required technical facilities), schedule compliance (time required to set up required systems), requirements definition (clarity of requested products); Operational resources: data collection efforts (time/effort required to analyze data), system maintenance (time/effort required to maintain the system), training/support/communication (efforts for user assistance).
    Level 2: Team capability — Team capacity: productivity rate (level of V&TC body deployment), required man-hours (total amount of hours used); Operational capability: throughput (products delivered/users served), utilization rate (hours-to-product ratio), response time (turn-around time on specific requests).
    Level 1: Support & information — System quality: usability (ease of use of information products), system features (customization of information products), access/availability (ease of reaching information products); Information quality: understandability (presentation of gathered information), consistency (provided information is consistent), importance/relevance (relevance of provided information).
    Level 2: Decision making — Individual impact: awareness/recall (better situational awareness), decision effectiveness (enhanced job effectiveness), individual productivity (increased personal productivity); Organizational impact: cost-effectiveness (information products save resources), increased capacity (increased effectiveness of operations), overall productivity (potentially improved outcomes).
  • Case study: NGO (4 suppliers, 7 consumers) vs. DHN (7 suppliers, 12 consumers). Similar (=): developers and entry team vs. developers vs. data entry; specific knowledge vs. expertise available. Comparable (≈): time-critical vs. time-limited; no budget vs. limited budget. Different (≠): geographically separate vs. located in one office; users are 'unknown' vs. direct contact with users.
  • Information supply (NGO development vs. V&TC deployment) — Level 1: Resources: system development (FA: facilities allocation, SC: schedule compliance, RD: requirements definition), operational resources (DC: data collection efforts, SM: system maintenance, TS: training, support and communication). Level 2: Capabilities: team capacity (PD: productivity, MH: required man-hours), operational capability (TP: throughput, UR: utilization rate, RT: response time).
  • Information use (NGO development vs. V&TC deployment) — Level 1: Information: system quality (US: usability, SF: system features, AV: availability), information quality (US: understandability, CO: consistency, IM: importance). Level 2: Processes: individual impact (US: usage, AW: awareness, EF: effectiveness, PR: productivity), organizational impact (US: usage, CE: cost-effective, CI: capacity increase, OP: overall productivity).
  • Findings — Difference in system use: agile vs. waterfall, organizational use; Increasing impact: strong integration, requirement analysis; Improving evaluation: sample selection, identifying population, also applicable to other information supply chains.
  • Future work — Refinements: historical data, feedback loops; Framework design: add/remove variables, scope of evaluation; Application: apply the framework to further cases. Evaluation approach: select case → apply framework (select participants, control group) → conduct interviews → verify results → statistical analysis (data store, model) → refine framework (refinement loop) → impact evaluation outcome. Impact evaluation for the V&TC: communicate, learn, advocate. For the V&TC — feedback: increase deployment impact; advocacy: secure resources; manage: improve products. For coordination — feedback: manage the pool of resources; advocacy: common understanding of IS impact; manage: identify gaps, ensure a good fit. For decision makers — feedback: improve effectiveness through IS use; advocacy: articulate needs and requirements; manage: improve IS use in future responses.
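The case studies above aim to detect differences between two information-supply deployments across a shared indicator set. As an illustrative sketch (the indicator names, scores, and threshold are assumptions, not data from the paper), flagging large score gaps between two deployments could look like:

```python
def key_differences(dep_a: dict[str, float], dep_b: dict[str, float],
                    threshold: float = 0.2) -> dict[str, float]:
    """Return indicators whose normalized (0-1) scores differ by at least
    `threshold` between two deployments (positive: dep_a scores higher)."""
    shared = dep_a.keys() & dep_b.keys()          # only compare shared indicators
    deltas = {k: dep_a[k] - dep_b[k] for k in shared}
    return {k: round(d, 2) for k, d in deltas.items() if abs(d) >= threshold}

# Hypothetical scores for the two case-study deployments
ngo = {"usability": 0.6, "capacity_increase": 0.7, "availability": 0.5}
vtc = {"usability": 0.9, "capacity_increase": 0.4, "availability": 0.9}
print(key_differences(vtc, ngo))
```

With these made-up numbers, all three indicators would be flagged: usability and availability higher for the V&TC deployment, capacity increase higher for the NGO — echoing the finding that the framework picks up key differences between 'regular' and V&TC deployments.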