ISCRAM Impact Evaluation



Emerging technologies provide opportunities for the humanitarian response community to enhance the effectiveness of its response to crisis situations. Part of this development can be attributed to a new type of information supply chain, driven by collaboration with digital, online communities, which enables organizations to make better-informed decisions. However, how exactly, and to what extent, this collaboration impacts the decision-making process is unknown. To improve these new information exchanges and the corresponding systems, an evaluation method is needed to assess the performance of these processes and systems. This paper builds on existing evaluation methods for information systems and on design principles to propose such an impact evaluation framework. The proposed framework has been applied in a case study to demonstrate its potential to identify areas for further improvement in the (online) collaboration between information suppliers and users.


Speaker notes:
  • First step: how can we determine the impact? Why is it important? We develop a lot of tools and provide services; how effective is that, and how can we better focus our efforts?
  • Impact evaluation is an important part of further steering the development of the V&TC: understand how they affect the response and how this can be improved.
  • Our approach: looking at business to derive theories and practices. This requires translation.
  • How do evaluations work, and what are they made up of?
  • What are evaluations used for?
  • What can the V&TC use evaluations for?
  • How do we build our evaluation framework?
  • We have two types: look at the outcomes, or look at the process.
  • Constructing an evaluation framework. Ingredients: objective (assessment, evaluate a project, evaluate a program), scope, measurements, and indicators, each derived from the previous. How does it fit into a process?
  • IT is a small part of the entire operation. Let's start by examining the scope. The main focus is the use of products in the processes and organization of the decision maker, not including the impact of the decisions made. With each level, more factors (confounding variables) are introduced, blurring the measurement.
  • Next, look at the measurements, again from two views: the efficiency of the generation of products, and the effectiveness of the use of those products. Combined, they yield the impact.
  • From the measurements we derive indicators. Various existing frameworks are used to find the indicators, which are then translated to the V&TC community.
  • Two case studies were used to test the framework; this does not give general information. The objective was to detect differences between two IS deployments, which showed some similarities and some differences. The results are checked with interviews.
  • Example output of the evaluation framework: good in data collection and facilities, and high productivity, but for a limited amount of time.
  • Systems: high availability and usability in the V&TC systems. Processes are limited: capacity increase, cost-effectiveness.
  • Yes, we can design an evaluation framework that provides useful results. The analysis shows that the framework is able to pick up on key differences between 'regular' and V&TC deployments. Main points: integration and sustainability. For the framework itself: embed it in the process, make it easy to use, and make it part of the deployment. It still depends on sampling; we need quantifiable indicators and automation.
  • The thesis was to demonstrate feasibility and explore the domain, and we succeeded: the goal was NOT to develop an accurate framework, but to lead by example and discussion. The biggest question: how are we going to use evaluation, and what for? It matters for managing the growing pool of options for decision makers and for improving the quality of systems in real time. This will determine the refinements and research efforts for further development.
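The notes above split a deployment's performance into supply-side efficiency (product generation) and demand-side effectiveness (product usage), which combined yield the impact. As a rough illustration only, assuming 1-5 Likert-scale indicator ratings and a simple averaging scheme (both are assumptions for this sketch, not part of the published framework), the combination could look like:

```python
from statistics import mean

# Hypothetical indicator ratings (1-5 Likert scale) for one deployment.
# Indicator names follow the deck's categories; the values are invented.
efficiency_ratings = {
    "facilities_allocation": 4, "schedule_compliance": 3,
    "requirements_definition": 2, "data_collection_efforts": 4,
}
effectiveness_ratings = {
    "usability": 5, "understandability": 4,
    "decision_effectiveness": 3, "increased_capacity": 3,
}

def score(ratings):
    """Average a set of 1-5 indicator ratings into a single 0-1 score."""
    return (mean(ratings.values()) - 1) / 4

efficiency = score(efficiency_ratings)        # supply side: product generation
effectiveness = score(effectiveness_ratings)  # demand side: product usage
impact = (efficiency + effectiveness) / 2     # naive equal-weight combination

print(f"efficiency={efficiency:.2f} effectiveness={effectiveness:.2f} impact={impact:.2f}")
```

In practice the paper derives its indicators from existing IS evaluation frameworks; the equal weighting used here is only a placeholder.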

    1. Title: "Designing towards an impact evaluation framework for a collaborative information supply chain". Kenny Meesters, Bartel Van de Walle. ISCRAM, Baden-Baden, May 2013.
    2. Outline: the V&TC domain (concept, approach), evaluation types and perspectives, systems evaluation, design of the framework (objectives, scope, measurements, indicators), research (supply, usage, results), findings, conclusion, and future work.
    3. The V&TC pipeline: data collection (media, geo-location, SMS) → data processing (analysis, verification) → dissemination (information products: maps, reports, etc.) → information consumers (decision making, monitoring).
    4. [Diagram: the V&TC ecosystem and its initiatives (volunteer training, transition, impact evaluation, data scramble, data licensing, preparedness) with participating organizations, among them SBTF, UN OCHA, iMMAP, Google, MapAction, OSM, GISCorps, Münster, ISCRAM, ICT4Peace, Harvard, UvT, UNV.] "The challenge is to improve coordination between the structured humanitarian system and the relatively loosely organized volunteer and technical communities." (Valerie Amos, UN Under-Secretary-General)
    5. Impact evaluation sits between theory and practice, and between business and the V&TC: what do we know, what do we want to know, and what do we need?
    6. In general, an evaluation consists of three steps: (1) define indicators and situations; (2) measure the status quo and the new situation; (3) analyze by comparison and draw conclusions.
    7. Applications: 1. impact evaluation, 2. impact assessment, 3. program evaluation (e.g. across Project A and Project B).
    8. Use for the V&TC (impact evaluation, impact assessment, program evaluation): determine how well specific initiatives perform; adjust and fine-tune specific decisions/projects; determine the 'best' response; manage provided solutions; secure resources; advocate.
    9. Next steps: design principles of frameworks (types, measurements, indicators); applying them to the V&TC (objective, scope and focus, indicators); evaluating the framework (case studies, refinable and usable tools).
    10. Evaluation types: formative (resource-centered, efficiency-oriented) versus summative (goal-centered, effectiveness-oriented).
    11. System evaluation. Efficiency-oriented perspective: resource investment, production capability, resource consumption. Effectiveness-oriented perspective: organizational performance, user performance, system performance.
    12. Evaluation implementation: the evaluation framework links project management (efficiency-oriented) to organizational performance (effectiveness-oriented) across departments and their (sub)projects.
    13. Scope, from broad to narrow: overall impact of the response to a crisis; impact of the decision-making process on the crisis; impact of information products on the decision-making process; effect of data processing on information products; impact of data collection on data processing; soft- and hardware impact on system performance.
    14. Measurements of a V&TC deployment. Efficiency-oriented perspective (system implementation, product generation efficiency): level 0: request/definition; level 1: resource allocation; level 2: team capability; level 3: investments. Effectiveness-oriented perspective (response effectiveness): level 1: support & information; level 2: decision making; level 3: response effectiveness. Combined, the two perspectives yield the impact.
    15. Indicators (objective, performance measure, as applied to the V&TC):
       Level 1, resources. System development: facilities allocation (availability of required technical facilities); schedule compliance (time required to set up required systems); requirements definition (clarity of requested products). Operational resources: data collection efforts (time/effort required to analyze data); system maintenance (time/effort required to maintain the system); training/support/communication (efforts for user assistance).
       Level 2, capabilities. Team capacity: productivity rate (level of V&TC body deployment); required man-hours (total amount of hours used). Operational capability: throughput (products delivered/users served); utilization rate (hours-to-product ratio); response time (turn-around time on specific requests).
       Level 1, information. System quality: usability (ease of use of information products); system features (customization of information products); access/availability (ease of reaching information products). Information quality: understandability (presentation of gathered information); consistency (provided information is consistent); importance/relevance (relevance of provided information).
       Level 2, processes. Individual impact: awareness/recall (better situational awareness); decision effectiveness (enhanced effectiveness of the job); individual productivity (increased personal productivity). Organizational impact: cost-effectiveness (information products save resources); increased capacity (increased effectiveness of operations); overall productivity (potentially improved outcomes).
    16. Case study comparing two deployments, an NGO case (4 suppliers, 7 consumers) and a DHN case (7 suppliers, 12 consumers): developers and entry team vs. developers and data entry (=); specific knowledge vs. expertise available (=); time critical vs. time limited (≈); no budget vs. limited budget (≈); geographically separate vs. located in one office (≠); users 'unknown' vs. direct contact with users (≠).
    17. Information supply results (NGO development vs. V&TC deployment). Level 1, resources: system development (facilities allocation, schedule compliance, requirements definition) and operational resources (data collection efforts, system maintenance, training/support and communication). Level 2, capabilities: team capacity (productivity, required man-hours) and operational capability (throughput, utilization rate, response time). [Charts compare the indicator scores of the two cases.]
    18. Information use results (NGO development vs. V&TC deployment). Level 1, information: system quality (usability, system features, availability) and information quality (understandability, consistency, importance). Level 2, processes: individual impact (usage, awareness, effectiveness, productivity) and organizational impact (usage, cost-effectiveness, capacity increase, overall productivity). [Charts compare the indicator scores of the two cases.]
    19. Findings. Difference in system use: agile vs. waterfall development, organizational use. Increasing impact: strong integration, requirements analysis. Improving evaluation: sample selection, identifying the population, applicability to other information supply chains.
    20. Future work. Refinements: historical data, feedback loops. Framework design: add/remove variables, scope of the evaluation. Application (evaluation approach): select a case, apply the framework, select participants and a control group, conduct interviews, verify the results, and refine the framework in a model refinement loop (statistical analysis, data store) leading to the impact evaluation outcome. Impact evaluation for the V&TC means communicate, learn, advocate: for the V&TC itself, feedback (increase deployment impact), advocacy (secure resources), and management (improve products); for coordination, feedback (manage the pool of resources), advocacy (a common understanding of IS impact), and management (identify gaps, ensure a good fit); for decision makers, feedback (improve effectiveness through IS use), advocacy (articulate needs and requirements), and management (improve IS use in future responses).
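The refinement loop above calls for statistical analysis over quantifiable indicators. One minimal, stdlib-only way to compare per-respondent indicator scores from two deployments is Welch's t-statistic; the sample data below is invented for illustration, and only the group sizes echo the case study.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples (unequal variances)."""
    se2 = variance(a) / len(a) + variance(b) / len(b)
    return (mean(a) - mean(b)) / se2 ** 0.5

# Invented per-respondent 'usability' ratings for the two deployments.
ngo_scores = [3, 4, 3, 2, 4, 3, 3]                  # 7 consumers (NGO case)
vtc_scores = [4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 3, 4]   # 12 consumers (DHN case)

t = welch_t(vtc_scores, ngo_scores)
print(f"Welch t = {t:.2f}")  # positive t suggests higher V&TC ratings in this sample
```

A complete analysis would also compute Welch-Satterthwaite degrees of freedom and a p-value, for instance via `scipy.stats.ttest_ind(..., equal_var=False)`.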