Evaluation Criteria



                              FITT
– Fostering Interregional Exchange in ICT Technology Transfer –



                                 www.FITT-for-Innovation.eu
Except where otherwise noted, this work is licensed under a Creative Commons Attribution 3.0 License.
Criteria for evaluation of transfer projects


• This practice is designed to assist in the preliminary assessment of research-
  grounded technology projects for their commercialization potential in the realm
  of technology transfer.

• The process of assessing research projects is necessitated by the high failure
  rate, and resulting high cost, of technologies either prior to reaching the
  market or once in the market.

• The Evaluation Criteria are intended to provide guidance for assessing an
  idea, a technology or a research project at an early stage of technology
  transfer (thus prior to product development).




2 | March 2011                     Evaluation criteria
The evaluation process

• Project evaluation may take place at various stages
     Early-stage (proof-of-concept “maturation” towards technology transfer)
     Pre-incubation
     Incubation
• Our focus is early-stage project evaluation, which may take place
     In a continuous manner (or at regular intervals)
     Based on a CFP (Call for Proposals, typically once per year)
• Such early-stage evaluation covers:
     Evaluation criteria
     A process for the application of these criteria, including the structure/organization
        of the evaluation committee

            The current practice focuses on the recommended Evaluation Criteria
3 | March 2011                          Evaluation criteria
Illustration of the evaluation process

[Diagram: timeline from Research → Development → Proof-of-concept → Market,
 with evaluation points for early-stage evaluation, pre-incubation entry
 evaluation and incubation entry evaluation]

Process:
• Description of project to be evaluated (document)
• Evaluation criteria
• Jury (evaluation committee)
 4 | March 2011                                                                                                                Evaluation criteria
Coverage/definition


• Evaluation criteria should cover three main aspects of a project:
      Technical aspects
      Market assessment
      Team considerations
• They should be defined and published in advance in order to allow the
  evaluated teams to adapt to the process

• They will be used to establish the overall process, evaluation documents and
  the selection committee

• Evaluation criteria may be used by the evaluation committee to
      Allocate funds/resources to selected projects
      Provide consultancy to the project team (for example, to coach the team on
         aspects considered “weak”; see the sketch below)
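
To make the grouping of criteria by aspect concrete, here is a minimal Python sketch (not part of the FITT practice itself) that groups a few of the criteria listed on the later slides by aspect and flags aspects whose average score falls below a threshold, e.g. to decide where coaching is needed. The criterion subset, the 1-3 scale and the threshold are illustrative assumptions.

```python
# Illustrative sketch only: grouping criteria by aspect and flagging "weak"
# aspects from hypothetical 1-3 scores. Criterion names follow later slides;
# the grouping and the threshold are assumptions.

CRITERIA_BY_ASPECT = {
    "technical": ["originality of the innovation", "project feasibility"],
    "market": ["potential users", "IP (protection issues, prior art)"],
    "team": ["lab support", "realism of the announced plan"],
}

def weak_aspects(scores: dict, threshold: float = 2.0) -> list:
    """Return aspects whose average score is below the threshold."""
    weak = []
    for aspect, criteria in CRITERIA_BY_ASPECT.items():
        values = [scores[c] for c in criteria if c in scores]
        if values and sum(values) / len(values) < threshold:
            weak.append(aspect)
    return weak

if __name__ == "__main__":
    example_scores = {  # hypothetical scores for one proposal
        "originality of the innovation": 3, "project feasibility": 2,
        "potential users": 1, "IP (protection issues, prior art)": 2,
        "lab support": 3, "realism of the announced plan": 3,
    }
    print(weak_aspects(example_scores))  # -> ['market']
```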

5 | March 2011                           Evaluation criteria
Evaluation criteria


 Possible evaluation criteria
                             • Lots of possible evaluation methods/criteria are mentioned
                                 in the literature
                             • Several possible groups of criteria :
Originality of the innovation          Profile of the inventor              Positive ROI/NPV estimations

Scientific return/opportunities        Business opportunity                 Venture value
for the laboratory

Project feasibility                    Market opportunities/threats         Regulatory constraints

Potential users                        IP (protection issues, prior art)    Business model

Scientific relevance of the project    Lab support                          Financial return

Team aspects                           Realism of the announced plan        Social & economic impact

Risk management                        Potential applications               Production issues




 6 | March 2011                                               Evaluation criteria
Focus on first-stage evaluation criteria


 Most important criteria for first-stage evaluation:

   Originality of the innovation          Profile of the inventor
   Scientific return/opportunities        Business opportunity
   for the laboratory
   Project feasibility                    Market opportunities/threats
   Potential users                        IP (protection issues, prior art)
   Scientific relevance of the project    Lab support
   Team aspects                           Realism of the announced plan
   Risk management                        Potential applications

 Deemed premature for the evaluation of early-stage technology transfer
 projects (prior to product development):

   Positive ROI/NPV calculations, Venture value, Regulatory constraints,
   Business model, Financial return, Social & economic impact, Production issues
 7 | March 2011                        Evaluation criteria
The DIGITEO example - Global positioning


The OMTE checklist is used for maturation projects:




 8 | March 2011                    Evaluation criteria
Timing of the annual call for proposals



 Long selection process:

   → March: launch of the call for proposals / deadline for submissions

   → April: preselection of 10 projects

   → May: coaching by Digiteo’s marketing team

   → June/July: final application, oral presentation, deliberation, final decision

   → September: launch of proof-of-concept activities for the selected projects

                Process of Digiteo’s maturation programme

  9 | March 2011             Evaluation criteria
From proposal to selection

 • ~ 10 proposals per year

 • Preselection: classification performed by Digiteo’s scientific committee and
   marketing staff

 • Coaching: work on the three components of the proposal (technology/marketing/IP)
   and submit a presentation for the final selection

 • Selection process:
        External experts (technology transfer specialists from: industry cluster,
         incubator, Paris region, OSEO innovation fund, chamber of commerce, etc.)
        Digiteo’s technology transfer committee
        Formal selection announced by Digiteo’s steering committee

 • Average of 5 projects selected per year


10 | March 2011                        Evaluation criteria
Selection steps


1. Scientific relevance, technical differentiation:   Scientific Committee

2. TT potential, value creation:                       Expert Panel

3. Recommendations:                                    Technology Transfer Committee

4. Final decision:                                     DIGITEO’s Steering Committee
  11 | March 2011                               Evaluation criteria
Digiteo’s evaluation checklist




12 | March 2011             Evaluation criteria
DIGITEO – Method/criteria

• Evaluation criteria used for the OMTE call for projects:
      « Product/technology » aspects
            Originality/uniqueness and scientific relevance, project feasibility and opportunities
             created for the laboratory.
      « Market » aspects
            Ongoing research contracts and IP related to the project, first applications and users
             considered.
      « Team » aspects
            Support of the laboratories in the process, project manager identified to manage the
             project, realism of the planning proposed and evaluation of the risks by the applicants.

• Method (a minimal scoring sketch follows below):
      Evaluation of the applications according to the 12 criteria
      Evaluators apply assessment scores from 1 to 3 (3 being the highest)
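
As a purely illustrative aid (not Digiteo’s actual tool), the Python sketch below records a 1-3 score for each of the 12 OMTE criteria named on the next three slides and returns a simple average; the aggregation rule is an assumption, since the slides only specify the scale.

```python
# Illustrative sketch of the scoring step: one 1-3 score per OMTE criterion.
# Criterion names are taken from the following slides; averaging is an assumption.

DIGITEO_CRITERIA = [
    "Originality of the innovation", "Scientific relevance of the project",
    "Project feasibility", "Scientific opportunities for the laboratory",
    "Ongoing research contracts", "Intellectual property",
    "First potential applications", "First potential users",
    "Support of the laboratories", "Project manager in charge",
    "Realism of the planning", "Evaluation of the risks",
]

def score_application(scores: dict) -> float:
    """Validate that all 12 criteria received a 1-3 score and return the average."""
    for criterion in DIGITEO_CRITERIA:
        if scores.get(criterion) not in (1, 2, 3):
            raise ValueError(f"Missing or invalid score for: {criterion}")
    return sum(scores[c] for c in DIGITEO_CRITERIA) / len(DIGITEO_CRITERIA)

if __name__ == "__main__":
    evaluation = {c: 2 for c in DIGITEO_CRITERIA}  # hypothetical evaluator input
    evaluation["Originality of the innovation"] = 3
    print(round(score_application(evaluation), 2))  # -> 2.08
```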


13 | March 2011                               Evaluation criteria
DIGITEO – « Product » criteria

 1. Originality of the innovation
       •   Originality/uniqueness in comparison with the state of the art?
       •   Definition of the future « product »?
       •   Positioning compared to competitors?

 2. Scientific relevance of the project
       •   Compatibility with the research themes covered by Digiteo?
       •   Scientific excellence in the field?
       •   Degree of scientific maturation (is the technology close to a « product »)?

 3. Project feasibility
       •   Technical feasibility of the project?
       •   Feasibility of the planning, with regard to a transfer?
       •   Description of the transfer model envisaged (transfer to an industrial partner / creation of a start-up)?

 4. Scientific opportunities created for the laboratory
       •   Consequences of the development on the scientific activities of the lab?
       •   Future impact of the project on the lab’s strategy?
       •   Impact on the external communications of the lab?


14 | March 2011                                     Evaluation criteria
DIGITEO – « Market » criteria

 5. Ongoing research contracts
       •   Ongoing contracts with industrial partners?
       •   Other contracts / scientific activities?
       •   Since when? For how long?

 6. Intellectual property (patents, know-how)
       •   Background knowledge of the teams involved?
       •   Protection envisaged (foreground) for the new knowledge and the software deriving from it?
       •   Is an IP analysis requested by the teams (analysis of the prior art, patent landscape and « freedom
           to operate »)?

 7. First potential applications
       •   Types/examples of applications?
       •   Value proposition (solution to which problem)?
       •   Applications realised by which kind of company (software company, service provider)?

 8. First potential users
       •   Existing and potential actors/partners to target for the transfer?
       •   Example of an end-user for the integrated solution?
       •   Draft definition of the targeted market (size, segmentation, competitors)?

15 | March 2011                                       Evaluation criteria
DIGITEO – « Team » criteria

 9. Support of the laboratories
       •   Support of the laboratories involved?
       •   Balance between the teams involved in the project (complementarity, synergy)?
       •   Commitment to a real transfer?
 10. Project manager in charge
       •   Profile of the project manager and involvement in the project?
       •   Capacity to manage all aspects of the project, in keeping with the transfer objective?
       •   Motivation to handle the 3 aspects: technical, IP, marketing?
 11. Realism of the planning
       Realism of the planning with regard to the 3 aspects:
       •   Technical
       •   IP
       •   Marketing
 12. Evaluation/consideration of the risks
       Identification and management of the risks:
       •   Technical
       •   IP
       •   Marketing

16 | March 2011                                     Evaluation criteria
DIGITEO - Assessment



• Useful tool to be used as a checklist throughout the evaluation process

• The final selection has to include the assessment of the presentation made
  in front of the jury. The grade given by the jury is based 50% on the written
  application and 50% on the oral presentation (see the sketch below).

• The jury should include a majority of external experts

• Final selection: classification/ranking of the presented projects (top 5
  selected)

• Some « Digiteo specifics » not to be considered for a generic checklist
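
The sketch below (an illustrative assumption, not Digiteo’s actual tool) shows the 50/50 combination of written and oral grades stated above and the ranking that keeps the top five projects; the grading scale and project names are hypothetical.

```python
# Illustrative sketch of the final-grade rule (50% written application,
# 50% oral presentation) and of ranking the presented projects to keep the
# top five. The data structure, scale and project names are hypothetical.

def final_grade(written: float, oral: float) -> float:
    """Combine written-application and oral-presentation grades 50/50."""
    return 0.5 * written + 0.5 * oral

def select_top(projects: dict, n: int = 5) -> list:
    """Rank projects by combined grade (highest first) and keep the n best."""
    ranked = sorted(projects, key=lambda p: final_grade(*projects[p]), reverse=True)
    return ranked[:n]

if __name__ == "__main__":
    applications = {  # hypothetical (written, oral) grades out of 20
        "Project A": (14.0, 16.0),
        "Project B": (12.0, 11.0),
        "Project C": (15.5, 13.0),
    }
    print(select_top(applications, n=2))  # -> ['Project A', 'Project C']
```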




17 | March 2011                     Evaluation criteria
Pros & Cons


 PROs
 • This practice attempts to formalize methods that are already in use
   (most of the time on an ad hoc basis)
 • The methodology and associated tools (call for proposals, criteria, etc.)
   are readily available and can be adapted to each individual case

 CONs
 • Only a selected number of criteria are highlighted
 • Some criteria may need to be further developed




18 | March 2011                      Evaluation criteria
Rationale



• Methodology developed by Digiteo in order to manage the incoming flow of
  technology transfer proposals

• Need for a consistent set of criteria for all steps of evaluation process,
  communicated transparently to all involved partners : project teams, internal
  Digiteo evaluators, “technology transfer coaches” and external experts

• Without this methodology, the involved parties would get the impression that
  projects might be evaluated/selected for obscure reasons. This would leave
  the door open to debate, accusations of “unfair competition” and backstage
  lobbying




19 | March 2011                      Evaluation criteria
Outcome


• The approach turned out as expected
         Final selection (with external experts) is based on relative ranking among
          the presented projects
         The scoring system is only used for individual evaluation purposes
• Coaching the teams before the final applications, supported by the evaluation
  checklist, proved to have a great impact on the oral presentations. Final
  applications show significant improvements (in terms of technology transfer
  readiness) compared with initial applications.

• Feedback indicates that the Digiteo community (researchers, external experts)
  judges the approach to be fair and clearly communicated




20 | March 2011                          Evaluation criteria
Lessons learned



 • A strict, fixed procedure can sometimes be counter-productive. For this
   reason, scoring was not made compulsory for the final evaluation but was
   used as a guideline during the selection committee meeting.
 • Special attention must be paid to the management of unsuccessful project
   teams. A debriefing session is organised with each eliminated team in
   order to:
             • clearly communicate the reasons for not being selected
             • focus on things to be improved (and how to improve them)
             • encourage them to apply again with an enhanced proposal



21 | March 2011                       Evaluation criteria
Plans for the future


The approach should be further developed/detailed:
      Definition of terms
      Explanation on how to apply each of the listed criteria (with some
         examples)




22 | March 2011                     Evaluation criteria
Evaluation Criteria in practice with IBBT


 IBBT wanted to use evaluation criteria for the following goals:

      scoring of new proposals at intake

      scoring & ranking of proposals within the same call

      quick visual overview of the scoring across several proposals

      evaluation or follow-up of a Bootcamp project or incubation project



 The approach (a minimal sketch follows below):

      A master list made up of all applicable evaluation criteria

      From this master list, a subset of criteria for each type of project was defined

      Validation of those criteria sets by means of a proof of concept in Excel format on
       different kinds of projects
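
The short Python sketch below illustrates the master-list / subset idea just described; the criterion names, project types and subsets are invented for illustration and are not IBBT’s actual lists (which were validated in Excel).

```python
# Illustrative sketch of a criteria master list with per-project-type subsets.
# All names below are invented placeholders, not IBBT's real lists.

MASTER_LIST = {
    "technical maturity", "market opportunity", "IP position",
    "team commitment", "suitability of the approach", "scalability",
}

SUBSETS_BY_PROJECT_TYPE = {
    "intake": {"technical maturity", "market opportunity", "team commitment"},
    "bootcamp": {"suitability of the approach", "team commitment", "IP position"},
    "incubation follow-up": MASTER_LIST,  # hypothetical: full list for follow-up
}

def criteria_for(project_type: str) -> set:
    """Return the criteria subset for a project type, checking that it only
    contains criteria taken from the master list."""
    subset = SUBSETS_BY_PROJECT_TYPE[project_type]
    if not subset <= MASTER_LIST:
        raise ValueError("subset contains criteria not in the master list")
    return subset

if __name__ == "__main__":
    print(sorted(criteria_for("bootcamp")))
```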
23 | March 2011                         Evaluation criteria
Lessons learned


 Using evaluation criteria to objectify and structure the assessment of a project
based on different domains, different criteria and sub-criteria proved to be very helpful:
     • in panel discussions: as a common base for discussion between the members of an expert
     panel to select the most promising business ideas to go through to the coaching track of iBoot
     • in coaching trajectories: as a feedback tool to the team, indicating weak and strong points and
     getting constructive feedback from experts
     • in follow-up meetings: as a common base for discussion between the innovation board
     members about the progress of incubation projects

 Since assessments are made by different coaches, innovation board members and
experts, we added some illustrative “quotes” to the 1-4 scoring. This makes the
nuances between scores more justifiable (a minimal sketch follows the example below).
     Example – Criterion - Suitability of the approach
     • Score 1: There is no connection whatsoever between the approach and the intended results
     • Score 2: The proposed approach is not very consistent with the intended innovation purpose
     • Score 3: The approach is consistent with the innovation purpose
     • Score 4: The approach is very well targeted
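
As a small illustration of how such anchors could be kept next to the 1-4 scale: the lookup structure below is our assumption, while the wording of the anchors is taken from the example above.

```python
# Sketch: keeping the illustrative "quotes" next to the 1-4 scale so that
# coaches, board members and experts anchor their scores the same way.
# The anchor wording comes from the example above; the lookup is an assumption.

SCORE_ANCHORS = {
    "Suitability of the approach": {
        1: "There is no connection whatsoever between the approach and the intended results",
        2: "The proposed approach is not very consistent with the intended innovation purpose",
        3: "The approach is consistent with the innovation purpose",
        4: "The approach is very well targeted",
    },
}

def justify(criterion: str, score: int) -> str:
    """Return the anchor quote that backs a given score for a criterion."""
    return f"{criterion} = {score}: {SCORE_ANCHORS[criterion][score]}"

if __name__ == "__main__":
    print(justify("Suitability of the approach", 3))
```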

 24 | March 2011                            Evaluation criteria
Evaluation Criteria in practice with ICBS

 At ICBS, the evaluation criteria of Digiteo have been used for the
evaluation of the final projects of the IED programme.
The IE&D programme (Innovation, Entrepreneurship & Design) is part of an MBA
course. Students embark on a live entrepreneurial journey and investigate the
commercial potential of their own idea or one being developed by Imperial
Innovations or Innovations Royal College of Art.

 Modification of the practice was needed before testing.
 New criteria were introduced:
    - Considerations of financials (economic feasibility)
    - Scalability (production & operation)


 25 | March 2011                   Evaluation criteria
Lessons learned


• The qualitative approach of the indicators makes it easy to evaluate the projects quickly.
• Useful to inform the candidates of the criteria that will be used by the jurors before the
  evaluation.
• Good tool to start a systematic evaluation of projects.
• Suggestion to add a general evaluation of the opportunity as a preliminary step in the
  process: analyze whether there is a real problem (demand pull) or whether it is a
  technology-push type of opportunity.
• Suggestion to include indicators on the economic feasibility of the project and on
  production/scalability. The goal is to see if the predicted production process/time is
  reasonable. The scalability can be evaluated by taking into account the time and
  money invested for the prototype. As a consequence, some of the items may be
  combined in order to introduce these new concepts.
• Re-writing some of the items would make them easier to understand without the
  guideline (the list of concepts).


26 | March 2011                         Evaluation criteria
Evaluation Criteria in practice with CRP Henri Tudor

The project definition process within Tudor has been reviewed. The goal was to
  organize the earliest phase of project development and include a screening phase
  where the project’s idea is assessed, including criteria on technology transfer.




Difference of context: criteria for pre-projects, where the project’s idea is developed,
  formalized and validated.

Objective
    • Create awareness of technology transfer in researchers’ minds when they write their
      project proposals
    • Improve the assessment of project ideas and provide a common reference for the entire organisation
    • Have better project follow-up to foster maturation projects and technology transfer
 27 | March 2011                        Evaluation criteria
Lessons learned



 • Putting in place a screening process created some fears within the
   researcher community, including the fear of stifling creativity. In the end,
   however, screening meetings have become a place of discussion where the
   project proposal can be assessed and enriched. The fact that these criteria
   are already used by peers is also a guarantee for the organisation’s
   management and for researchers.
 • It is important to spend time on change management when implementing a
   new process that will have a strong impact on the organisation.
 • The outcome for Tudor is a better overview of the projects developed within
   the organisation for all stakeholders, and the integration of the transfer
   perspective at the very beginning of projects.




28 | March 2011                    Evaluation criteria
Suggested Readings


 Link to code book

assessment; method; process; selection



 Link to related websites

OMTE call for proposals and projects selected during previous editions:
 http://www.digiteo.fr/Digiteo_OMTE




29 | March 2011                    Evaluation criteria

More Related Content

What's hot

NG BB 37 Multiple Regression
NG BB 37 Multiple RegressionNG BB 37 Multiple Regression
NG BB 37 Multiple RegressionLeanleaders.org
 
NG BB 06 Project Charter
NG BB 06 Project CharterNG BB 06 Project Charter
NG BB 06 Project CharterLeanleaders.org
 
NG BB 47 Basic Design of Experiments
NG BB 47 Basic Design of ExperimentsNG BB 47 Basic Design of Experiments
NG BB 47 Basic Design of ExperimentsLeanleaders.org
 
NG BB 19 Document and Analyze the Process
NG BB 19 Document and Analyze the ProcessNG BB 19 Document and Analyze the Process
NG BB 19 Document and Analyze the ProcessLeanleaders.org
 
ANG_AFSO21_Awareness_Training_(DULUTH)
ANG_AFSO21_Awareness_Training_(DULUTH)ANG_AFSO21_Awareness_Training_(DULUTH)
ANG_AFSO21_Awareness_Training_(DULUTH)Leanleaders.org
 
NG BB 42 Visual Management
NG BB 42 Visual ManagementNG BB 42 Visual Management
NG BB 42 Visual ManagementLeanleaders.org
 
Govind kulkarni
Govind kulkarniGovind kulkarni
Govind kulkarniNASSCOM
 
NG BB 20 Data Collection
NG BB 20 Data CollectionNG BB 20 Data Collection
NG BB 20 Data CollectionLeanleaders.org
 
NG BB 45 Quick Change Over
NG BB 45 Quick Change OverNG BB 45 Quick Change Over
NG BB 45 Quick Change OverLeanleaders.org
 
NG BB 05 Roles and Responsibilities
NG BB 05 Roles and ResponsibilitiesNG BB 05 Roles and Responsibilities
NG BB 05 Roles and ResponsibilitiesLeanleaders.org
 
NG BB 09 Project Management
NG BB 09 Project ManagementNG BB 09 Project Management
NG BB 09 Project ManagementLeanleaders.org
 
BioMedical Strategy Medical Devices Workshop Presentations
BioMedical Strategy Medical Devices Workshop PresentationsBioMedical Strategy Medical Devices Workshop Presentations
BioMedical Strategy Medical Devices Workshop PresentationsBioMedical Strategy (2004) Ltd.
 
NG BB 55 CONTROL Tollgate
NG BB 55 CONTROL TollgateNG BB 55 CONTROL Tollgate
NG BB 55 CONTROL TollgateLeanleaders.org
 
Quality Assurance
Quality AssuranceQuality Assurance
Quality AssuranceKiran Kumar
 
NG BB 28 MEASURE Tollgate
NG BB 28 MEASURE TollgateNG BB 28 MEASURE Tollgate
NG BB 28 MEASURE TollgateLeanleaders.org
 
NG BB 40 Solution Selection
NG BB 40 Solution SelectionNG BB 40 Solution Selection
NG BB 40 Solution SelectionLeanleaders.org
 
NG BB 07 Multi-Generation Project Planning
NG BB 07 Multi-Generation Project PlanningNG BB 07 Multi-Generation Project Planning
NG BB 07 Multi-Generation Project PlanningLeanleaders.org
 
NG BB 50 Rapid Improvement Event
NG BB 50 Rapid Improvement EventNG BB 50 Rapid Improvement Event
NG BB 50 Rapid Improvement EventLeanleaders.org
 

What's hot (20)

NG BB 37 Multiple Regression
NG BB 37 Multiple RegressionNG BB 37 Multiple Regression
NG BB 37 Multiple Regression
 
NG BB 06 Project Charter
NG BB 06 Project CharterNG BB 06 Project Charter
NG BB 06 Project Charter
 
NG BB 47 Basic Design of Experiments
NG BB 47 Basic Design of ExperimentsNG BB 47 Basic Design of Experiments
NG BB 47 Basic Design of Experiments
 
NG BB 19 Document and Analyze the Process
NG BB 19 Document and Analyze the ProcessNG BB 19 Document and Analyze the Process
NG BB 19 Document and Analyze the Process
 
ANG_AFSO21_Awareness_Training_(DULUTH)
ANG_AFSO21_Awareness_Training_(DULUTH)ANG_AFSO21_Awareness_Training_(DULUTH)
ANG_AFSO21_Awareness_Training_(DULUTH)
 
NG BB 42 Visual Management
NG BB 42 Visual ManagementNG BB 42 Visual Management
NG BB 42 Visual Management
 
NG BB 04 DEFINE Roadmap
NG BB 04 DEFINE RoadmapNG BB 04 DEFINE Roadmap
NG BB 04 DEFINE Roadmap
 
Govind kulkarni
Govind kulkarniGovind kulkarni
Govind kulkarni
 
NG BB 20 Data Collection
NG BB 20 Data CollectionNG BB 20 Data Collection
NG BB 20 Data Collection
 
NG BB 45 Quick Change Over
NG BB 45 Quick Change OverNG BB 45 Quick Change Over
NG BB 45 Quick Change Over
 
NG BB 05 Roles and Responsibilities
NG BB 05 Roles and ResponsibilitiesNG BB 05 Roles and Responsibilities
NG BB 05 Roles and Responsibilities
 
NG BB 09 Project Management
NG BB 09 Project ManagementNG BB 09 Project Management
NG BB 09 Project Management
 
BioMedical Strategy Medical Devices Workshop Presentations
BioMedical Strategy Medical Devices Workshop PresentationsBioMedical Strategy Medical Devices Workshop Presentations
BioMedical Strategy Medical Devices Workshop Presentations
 
NG BB 55 CONTROL Tollgate
NG BB 55 CONTROL TollgateNG BB 55 CONTROL Tollgate
NG BB 55 CONTROL Tollgate
 
Quality Assurance
Quality AssuranceQuality Assurance
Quality Assurance
 
NG BB 28 MEASURE Tollgate
NG BB 28 MEASURE TollgateNG BB 28 MEASURE Tollgate
NG BB 28 MEASURE Tollgate
 
NG BB 40 Solution Selection
NG BB 40 Solution SelectionNG BB 40 Solution Selection
NG BB 40 Solution Selection
 
NG BB 07 Multi-Generation Project Planning
NG BB 07 Multi-Generation Project PlanningNG BB 07 Multi-Generation Project Planning
NG BB 07 Multi-Generation Project Planning
 
NG BB 11 Power Steering
NG BB 11 Power SteeringNG BB 11 Power Steering
NG BB 11 Power Steering
 
NG BB 50 Rapid Improvement Event
NG BB 50 Rapid Improvement EventNG BB 50 Rapid Improvement Event
NG BB 50 Rapid Improvement Event
 

Viewers also liked

Criteria for evaluation
Criteria for evaluationCriteria for evaluation
Criteria for evaluationJuliet Cabiles
 
Training evaluation ppt 6
Training evaluation   ppt 6Training evaluation   ppt 6
Training evaluation ppt 6SBMC Jobs
 
Establishing Criteria for Evaluation
Establishing Criteria for EvaluationEstablishing Criteria for Evaluation
Establishing Criteria for EvaluationJodie Nicotra
 
ENGL 202 Project 3 Schedule
ENGL 202 Project 3 ScheduleENGL 202 Project 3 Schedule
ENGL 202 Project 3 ScheduleJodie Nicotra
 
ETL tool evaluation criteria
ETL tool evaluation criteriaETL tool evaluation criteria
ETL tool evaluation criteriaAsis Mohanty
 
Information Technology Security Techniques Evaluation Criteria For It Secrit...
Information Technology  Security Techniques Evaluation Criteria For It Secrit...Information Technology  Security Techniques Evaluation Criteria For It Secrit...
Information Technology Security Techniques Evaluation Criteria For It Secrit...Vishnu Kesarwani
 
Evaluation criteria
Evaluation criteriaEvaluation criteria
Evaluation criteriaCarr Tamara
 
IIT Academy: 204 User stories and acceptance criteria
IIT Academy: 204 User stories and acceptance criteriaIIT Academy: 204 User stories and acceptance criteria
IIT Academy: 204 User stories and acceptance criteriaSteven HK Ma | 馬國豪
 
7-point checklist that every Project Manager needs to look at
7-point checklist that every Project Manager needs to look at 7-point checklist that every Project Manager needs to look at
7-point checklist that every Project Manager needs to look at ProofHub
 
G321 Marking Criteria
G321 Marking CriteriaG321 Marking Criteria
G321 Marking Criteriajfoster10
 
Business Idea Brainstorming and Evaluation Techniques
Business Idea Brainstorming and Evaluation TechniquesBusiness Idea Brainstorming and Evaluation Techniques
Business Idea Brainstorming and Evaluation TechniquesMark Tayar
 
Acceptance Criteria with SpecFlow
Acceptance Criteria with SpecFlowAcceptance Criteria with SpecFlow
Acceptance Criteria with SpecFlowMarcin Floryan
 
Brainstorming techniques
Brainstorming techniquesBrainstorming techniques
Brainstorming techniquesAmin Hanif
 
Energy Efficiency financing & evaluation criteria
Energy Efficiency  financing & evaluation criteriaEnergy Efficiency  financing & evaluation criteria
Energy Efficiency financing & evaluation criteriaZAINI ABDUL WAHAB
 

Viewers also liked (20)

Criteria for evaluation
Criteria for evaluationCriteria for evaluation
Criteria for evaluation
 
Evaluation ppt
Evaluation pptEvaluation ppt
Evaluation ppt
 
Training evaluation ppt 6
Training evaluation   ppt 6Training evaluation   ppt 6
Training evaluation ppt 6
 
Establishing Criteria for Evaluation
Establishing Criteria for EvaluationEstablishing Criteria for Evaluation
Establishing Criteria for Evaluation
 
ENGL 202 Project 3 Schedule
ENGL 202 Project 3 ScheduleENGL 202 Project 3 Schedule
ENGL 202 Project 3 Schedule
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
 
ETL tool evaluation criteria
ETL tool evaluation criteriaETL tool evaluation criteria
ETL tool evaluation criteria
 
Information Technology Security Techniques Evaluation Criteria For It Secrit...
Information Technology  Security Techniques Evaluation Criteria For It Secrit...Information Technology  Security Techniques Evaluation Criteria For It Secrit...
Information Technology Security Techniques Evaluation Criteria For It Secrit...
 
Evaluation criteria
Evaluation criteriaEvaluation criteria
Evaluation criteria
 
Prince 2, project managment Document Acceptance criteria
Prince 2, project managment Document Acceptance criteriaPrince 2, project managment Document Acceptance criteria
Prince 2, project managment Document Acceptance criteria
 
IIT Academy: 204 User stories and acceptance criteria
IIT Academy: 204 User stories and acceptance criteriaIIT Academy: 204 User stories and acceptance criteria
IIT Academy: 204 User stories and acceptance criteria
 
7-point checklist that every Project Manager needs to look at
7-point checklist that every Project Manager needs to look at 7-point checklist that every Project Manager needs to look at
7-point checklist that every Project Manager needs to look at
 
G321 Marking Criteria
G321 Marking CriteriaG321 Marking Criteria
G321 Marking Criteria
 
User Stories
User StoriesUser Stories
User Stories
 
Business Idea Brainstorming and Evaluation Techniques
Business Idea Brainstorming and Evaluation TechniquesBusiness Idea Brainstorming and Evaluation Techniques
Business Idea Brainstorming and Evaluation Techniques
 
Evaluation process and criteria june 2 2015 final
Evaluation process and criteria june 2 2015 finalEvaluation process and criteria june 2 2015 final
Evaluation process and criteria june 2 2015 final
 
Relief Line - Draft Evaluation Process and Criteria
Relief Line - Draft Evaluation Process and CriteriaRelief Line - Draft Evaluation Process and Criteria
Relief Line - Draft Evaluation Process and Criteria
 
Acceptance Criteria with SpecFlow
Acceptance Criteria with SpecFlowAcceptance Criteria with SpecFlow
Acceptance Criteria with SpecFlow
 
Brainstorming techniques
Brainstorming techniquesBrainstorming techniques
Brainstorming techniques
 
Energy Efficiency financing & evaluation criteria
Energy Efficiency  financing & evaluation criteriaEnergy Efficiency  financing & evaluation criteria
Energy Efficiency financing & evaluation criteria
 

Similar to FITT Toolbox: Evaluation Criteria

Process performance models case study
Process performance models case studyProcess performance models case study
Process performance models case studyKobi Vider
 
FITT Toolbox: Evaluation of Transfer Projects
FITT Toolbox: Evaluation of Transfer ProjectsFITT Toolbox: Evaluation of Transfer Projects
FITT Toolbox: Evaluation of Transfer ProjectsFITT
 
Agile Requirements
Agile RequirementsAgile Requirements
Agile RequirementsBen Linders
 
Phase gate, 5 s lean manufacturing
Phase gate, 5 s lean manufacturingPhase gate, 5 s lean manufacturing
Phase gate, 5 s lean manufacturingUdo Dittmar
 
Product QA - A test engineering perspective
Product QA - A test engineering perspectiveProduct QA - A test engineering perspective
Product QA - A test engineering perspectiveImaginea
 
OpenERP - Project Methodology
OpenERP - Project MethodologyOpenERP - Project Methodology
OpenERP - Project MethodologyOdoo
 
Estimator Metrics STC 2009
Estimator Metrics STC 2009Estimator Metrics STC 2009
Estimator Metrics STC 2009Amit Bhardwaj
 
Process Evaluation Of Transfer Projects Ppt Final
Process Evaluation Of Transfer Projects Ppt FinalProcess Evaluation Of Transfer Projects Ppt Final
Process Evaluation Of Transfer Projects Ppt FinalFITT
 
Configuration management
Configuration managementConfiguration management
Configuration managementKobi Vider
 
Process Guidelines V2
Process Guidelines V2Process Guidelines V2
Process Guidelines V2Imaginea
 
PDMA Process StageGate
PDMA Process StageGatePDMA Process StageGate
PDMA Process StageGateDeepSmarts
 
Doe Taguchi Basic Manual1
Doe Taguchi Basic Manual1Doe Taguchi Basic Manual1
Doe Taguchi Basic Manual1nazeer pasha
 
OpenERP- Partner First Project Support
OpenERP- Partner First Project SupportOpenERP- Partner First Project Support
OpenERP- Partner First Project SupportOdoo
 
Cox Automotive: Testing Across Multiple Brands
Cox Automotive: Testing Across Multiple BrandsCox Automotive: Testing Across Multiple Brands
Cox Automotive: Testing Across Multiple BrandsOptimizely
 
Expedition Innovation Conference: Innovation Places, Spaces And Methods
Expedition Innovation Conference: Innovation Places, Spaces And MethodsExpedition Innovation Conference: Innovation Places, Spaces And Methods
Expedition Innovation Conference: Innovation Places, Spaces And Methodsagchute
 
NG BB 15 MEASURE Roadmap
NG BB 15 MEASURE RoadmapNG BB 15 MEASURE Roadmap
NG BB 15 MEASURE RoadmapLeanleaders.org
 

Similar to FITT Toolbox: Evaluation Criteria (20)

Process performance models case study
Process performance models case studyProcess performance models case study
Process performance models case study
 
FITT Toolbox: Evaluation of Transfer Projects
FITT Toolbox: Evaluation of Transfer ProjectsFITT Toolbox: Evaluation of Transfer Projects
FITT Toolbox: Evaluation of Transfer Projects
 
Sop test planning
Sop test planningSop test planning
Sop test planning
 
Agile Requirements
Agile RequirementsAgile Requirements
Agile Requirements
 
Phase gate, 5 s lean manufacturing
Phase gate, 5 s lean manufacturingPhase gate, 5 s lean manufacturing
Phase gate, 5 s lean manufacturing
 
Product QA - A test engineering perspective
Product QA - A test engineering perspectiveProduct QA - A test engineering perspective
Product QA - A test engineering perspective
 
OpenERP - Project Methodology
OpenERP - Project MethodologyOpenERP - Project Methodology
OpenERP - Project Methodology
 
NPD- Stage Gate Presentation
NPD- Stage Gate PresentationNPD- Stage Gate Presentation
NPD- Stage Gate Presentation
 
Estimator Metrics STC 2009
Estimator Metrics STC 2009Estimator Metrics STC 2009
Estimator Metrics STC 2009
 
chapter 7.ppt
chapter 7.pptchapter 7.ppt
chapter 7.ppt
 
Sanitized tb swstmppp1516july
Sanitized tb swstmppp1516julySanitized tb swstmppp1516july
Sanitized tb swstmppp1516july
 
Process Evaluation Of Transfer Projects Ppt Final
Process Evaluation Of Transfer Projects Ppt FinalProcess Evaluation Of Transfer Projects Ppt Final
Process Evaluation Of Transfer Projects Ppt Final
 
Configuration management
Configuration managementConfiguration management
Configuration management
 
Process Guidelines V2
Process Guidelines V2Process Guidelines V2
Process Guidelines V2
 
PDMA Process StageGate
PDMA Process StageGatePDMA Process StageGate
PDMA Process StageGate
 
Doe Taguchi Basic Manual1
Doe Taguchi Basic Manual1Doe Taguchi Basic Manual1
Doe Taguchi Basic Manual1
 
OpenERP- Partner First Project Support
OpenERP- Partner First Project SupportOpenERP- Partner First Project Support
OpenERP- Partner First Project Support
 
Cox Automotive: Testing Across Multiple Brands
Cox Automotive: Testing Across Multiple BrandsCox Automotive: Testing Across Multiple Brands
Cox Automotive: Testing Across Multiple Brands
 
Expedition Innovation Conference: Innovation Places, Spaces And Methods
Expedition Innovation Conference: Innovation Places, Spaces And MethodsExpedition Innovation Conference: Innovation Places, Spaces And Methods
Expedition Innovation Conference: Innovation Places, Spaces And Methods
 
NG BB 15 MEASURE Roadmap
NG BB 15 MEASURE RoadmapNG BB 15 MEASURE Roadmap
NG BB 15 MEASURE Roadmap
 

More from FITT

Prof. Thomas Baaken:Science-to-Business Marketing - A new Model in Knowledge ...
Prof. Thomas Baaken:Science-to-Business Marketing - A new Model in Knowledge ...Prof. Thomas Baaken:Science-to-Business Marketing - A new Model in Knowledge ...
Prof. Thomas Baaken:Science-to-Business Marketing - A new Model in Knowledge ...FITT
 
Mario Cameron: Turning Science into Business: From Research to Market – the E...
Mario Cameron: Turning Science into Business: From Research to Market – the E...Mario Cameron: Turning Science into Business: From Research to Market – the E...
Mario Cameron: Turning Science into Business: From Research to Market – the E...FITT
 
FITT Toolbox: Networking & Clustering
FITT Toolbox: Networking & ClusteringFITT Toolbox: Networking & Clustering
FITT Toolbox: Networking & ClusteringFITT
 
FITT Toolbox: Network Management Scorecards
FITT Toolbox: Network Management ScorecardsFITT Toolbox: Network Management Scorecards
FITT Toolbox: Network Management ScorecardsFITT
 
FITT Toolbox: Research meets Business
FITT Toolbox: Research meets BusinessFITT Toolbox: Research meets Business
FITT Toolbox: Research meets BusinessFITT
 
FITT Toolbox: Cluster Manual
FITT Toolbox: Cluster ManualFITT Toolbox: Cluster Manual
FITT Toolbox: Cluster ManualFITT
 
FITT Toolbox: Cluster Management Scorecard
FITT Toolbox: Cluster Management ScorecardFITT Toolbox: Cluster Management Scorecard
FITT Toolbox: Cluster Management ScorecardFITT
 
FITT Toolbox: Network Management
FITT Toolbox: Network ManagementFITT Toolbox: Network Management
FITT Toolbox: Network ManagementFITT
 
FITT Toolbox: Technology Transfer & Web 2.0
FITT Toolbox: Technology Transfer & Web 2.0FITT Toolbox: Technology Transfer & Web 2.0
FITT Toolbox: Technology Transfer & Web 2.0FITT
 
FITT Toolbox: International Technology Transfer Networks
FITT Toolbox: International Technology Transfer NetworksFITT Toolbox: International Technology Transfer Networks
FITT Toolbox: International Technology Transfer NetworksFITT
 
FITT Toolbox: Cluster Collaboration Platform
FITT Toolbox: Cluster Collaboration PlatformFITT Toolbox: Cluster Collaboration Platform
FITT Toolbox: Cluster Collaboration PlatformFITT
 
FITT Toolbox: Network Support Services
FITT Toolbox: Network Support ServicesFITT Toolbox: Network Support Services
FITT Toolbox: Network Support ServicesFITT
 
FITT Toolbox: Technology Transfer (TT) Collaboration
FITT Toolbox: Technology Transfer (TT) CollaborationFITT Toolbox: Technology Transfer (TT) Collaboration
FITT Toolbox: Technology Transfer (TT) CollaborationFITT
 
FITT Toolbox: Open Source Business Model - Geosparc
FITT Toolbox: Open Source Business Model - GeosparcFITT Toolbox: Open Source Business Model - Geosparc
FITT Toolbox: Open Source Business Model - GeosparcFITT
 
FITT Toolbox: Market Assessment: Pitch your Idea!
FITT Toolbox: Market Assessment: Pitch your Idea!FITT Toolbox: Market Assessment: Pitch your Idea!
FITT Toolbox: Market Assessment: Pitch your Idea!FITT
 
FITT Toolbox: How to manage Uncertainty in Business Strategy
FITT Toolbox: How to manage Uncertainty in Business StrategyFITT Toolbox: How to manage Uncertainty in Business Strategy
FITT Toolbox: How to manage Uncertainty in Business StrategyFITT
 
FITT Toolbox: Open Source Business Model
FITT Toolbox: Open Source Business ModelFITT Toolbox: Open Source Business Model
FITT Toolbox: Open Source Business ModelFITT
 
FITT Toolbox: Open Source Business Model
FITT Toolbox: Open Source Business ModelFITT Toolbox: Open Source Business Model
FITT Toolbox: Open Source Business ModelFITT
 
Dr. Carolina Garcia Rizo: Commercializing Innovative Technologies: The US Per...
Dr. Carolina Garcia Rizo: Commercializing Innovative Technologies: The US Per...Dr. Carolina Garcia Rizo: Commercializing Innovative Technologies: The US Per...
Dr. Carolina Garcia Rizo: Commercializing Innovative Technologies: The US Per...FITT
 
FITT Toolbox: Business Model Design
FITT Toolbox: Business Model DesignFITT Toolbox: Business Model Design
FITT Toolbox: Business Model DesignFITT
 

More from FITT (20)

Prof. Thomas Baaken:Science-to-Business Marketing - A new Model in Knowledge ...
Prof. Thomas Baaken:Science-to-Business Marketing - A new Model in Knowledge ...Prof. Thomas Baaken:Science-to-Business Marketing - A new Model in Knowledge ...
Prof. Thomas Baaken:Science-to-Business Marketing - A new Model in Knowledge ...
 
Mario Cameron: Turning Science into Business: From Research to Market – the E...
Mario Cameron: Turning Science into Business: From Research to Market – the E...Mario Cameron: Turning Science into Business: From Research to Market – the E...
Mario Cameron: Turning Science into Business: From Research to Market – the E...
 
FITT Toolbox: Networking & Clustering
FITT Toolbox: Networking & ClusteringFITT Toolbox: Networking & Clustering
FITT Toolbox: Networking & Clustering
 
FITT Toolbox: Network Management Scorecards
FITT Toolbox: Network Management ScorecardsFITT Toolbox: Network Management Scorecards
FITT Toolbox: Network Management Scorecards
 
FITT Toolbox: Research meets Business
FITT Toolbox: Research meets BusinessFITT Toolbox: Research meets Business
FITT Toolbox: Research meets Business
 
FITT Toolbox: Cluster Manual
FITT Toolbox: Cluster ManualFITT Toolbox: Cluster Manual
FITT Toolbox: Cluster Manual
 
FITT Toolbox: Cluster Management Scorecard
FITT Toolbox: Cluster Management ScorecardFITT Toolbox: Cluster Management Scorecard
FITT Toolbox: Cluster Management Scorecard
 
FITT Toolbox: Network Management
FITT Toolbox: Network ManagementFITT Toolbox: Network Management
FITT Toolbox: Network Management
 
FITT Toolbox: Technology Transfer & Web 2.0
FITT Toolbox: Technology Transfer & Web 2.0FITT Toolbox: Technology Transfer & Web 2.0
FITT Toolbox: Technology Transfer & Web 2.0
 
FITT Toolbox: International Technology Transfer Networks
FITT Toolbox: International Technology Transfer NetworksFITT Toolbox: International Technology Transfer Networks
FITT Toolbox: International Technology Transfer Networks
 
FITT Toolbox: Cluster Collaboration Platform
FITT Toolbox: Cluster Collaboration PlatformFITT Toolbox: Cluster Collaboration Platform
FITT Toolbox: Cluster Collaboration Platform
 
FITT Toolbox: Network Support Services
FITT Toolbox: Network Support ServicesFITT Toolbox: Network Support Services
FITT Toolbox: Network Support Services
 
FITT Toolbox: Technology Transfer (TT) Collaboration
FITT Toolbox: Technology Transfer (TT) CollaborationFITT Toolbox: Technology Transfer (TT) Collaboration
FITT Toolbox: Technology Transfer (TT) Collaboration
 
FITT Toolbox: Open Source Business Model - Geosparc
FITT Toolbox: Open Source Business Model - GeosparcFITT Toolbox: Open Source Business Model - Geosparc
FITT Toolbox: Open Source Business Model - Geosparc
 
FITT Toolbox: Market Assessment: Pitch your Idea!
FITT Toolbox: Market Assessment: Pitch your Idea!FITT Toolbox: Market Assessment: Pitch your Idea!
FITT Toolbox: Market Assessment: Pitch your Idea!
 
FITT Toolbox: How to manage Uncertainty in Business Strategy
FITT Toolbox: How to manage Uncertainty in Business StrategyFITT Toolbox: How to manage Uncertainty in Business Strategy
FITT Toolbox: How to manage Uncertainty in Business Strategy
 
FITT Toolbox: Open Source Business Model
FITT Toolbox: Open Source Business ModelFITT Toolbox: Open Source Business Model
FITT Toolbox: Open Source Business Model
 
FITT Toolbox: Open Source Business Model
FITT Toolbox: Open Source Business ModelFITT Toolbox: Open Source Business Model
FITT Toolbox: Open Source Business Model
 
Dr. Carolina Garcia Rizo: Commercializing Innovative Technologies: The US Per...
Dr. Carolina Garcia Rizo: Commercializing Innovative Technologies: The US Per...Dr. Carolina Garcia Rizo: Commercializing Innovative Technologies: The US Per...
Dr. Carolina Garcia Rizo: Commercializing Innovative Technologies: The US Per...
 
FITT Toolbox: Business Model Design
FITT Toolbox: Business Model DesignFITT Toolbox: Business Model Design
FITT Toolbox: Business Model Design
 

Recently uploaded

APIForce Zurich 5 April Automation LPDG
APIForce Zurich 5 April  Automation LPDGAPIForce Zurich 5 April  Automation LPDG
APIForce Zurich 5 April Automation LPDGMarianaLemus7
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentationphoebematthew05
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
Science&tech:THE INFORMATION AGE STS.pdf
Science&tech:THE INFORMATION AGE STS.pdfScience&tech:THE INFORMATION AGE STS.pdf
Science&tech:THE INFORMATION AGE STS.pdfjimielynbastida
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebUiPathCommunity
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...Fwdays
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii SoldatenkoFwdays
 
Key Features Of Token Development (1).pptx
Key  Features Of Token  Development (1).pptxKey  Features Of Token  Development (1).pptx
Key Features Of Token Development (1).pptxLBM Solutions
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Mark Simos
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksSoftradix Technologies
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 

Recently uploaded (20)

APIForce Zurich 5 April Automation LPDG
APIForce Zurich 5 April  Automation LPDGAPIForce Zurich 5 April  Automation LPDG
APIForce Zurich 5 April Automation LPDG
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentation
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024

FITT Toolbox: Evaluation Criteria

  • 1. Evaluation Criteria FITT – Fostering Interregional Exchange in ICT Technology Transfer – www.FITT-for-Innovation.eu Except where otherwise noted, this work is licensed under a Creative Commons Attribution 3.0 License.
  • 2. Criteria for evaluation of transfer projects • This practice is designed to assist in the preliminary assessment of research- grounded technology projects for their commercialization potential in the realm of technology transfer. • The process of assessing research projects is necessitated by the high failure rate, and resulting high cost, of technologies either prior to reaching the market or once in the market. • The Evaluation Criteria are intended to provide guidance for assessing an idea, a technology or a research project, at an early-stage of technology transfer (thus prior to product development). 2 | March 2011 Evaluation criteria
  • 3. The evaluation process • Project evaluation may take place at various stages  Early-stage (proof-of-concept “maturation” towards technology transfer)  Pre-incubation  Incubation • Our focus is Early Stage Project Evaluation, which may appear  In a continuous manner (or at regular intervals)  Based on a CFP (Call For Proposal, typically once per year) • Such early stage evaluation covers :  Evaluation criteria  A process for the application of these criteria, including the structure/organization of the evaluation committee  The current practice focuses on recommended Evaluation Criteria 3 | March 2011 Evaluation criteria
  • 4. Illustration of the evaluation process
    • Timeline: Research → Development → Proof-of-concept → Market, with an early-stage evaluation, a pre-incubation entry evaluation and an incubation entry evaluation along the way
    • Process inputs: description of the project to be evaluated (document), evaluation criteria, jury (evaluation committee)
    4 | March 2011 Evaluation criteria
  • 5. Coverage/definition
    • Evaluation criteria should cover three main aspects of a project:
       Technical aspects
       Market assessment
       Team considerations
    • They should be defined and published in advance in order to allow the evaluated teams to adapt to the process
    • They will be used to establish the overall process, evaluation documents and the selection committee
    • Evaluation criteria may be used by the evaluation committee to
       Allocate funds/resources to selected projects
       Provide consultancy to the project team (for example, to coach the team on aspects considered “weak”)
    5 | March 2011 Evaluation criteria
  • 6. Evaluation criteria – possible evaluation criteria
    • Many possible evaluation methods/criteria are mentioned in the literature
    • Possible criteria include: originality of the innovation, profile of the inventor, positive ROI/NPV estimations, scientific return/opportunities for the laboratory, business opportunity, venture value, project feasibility, market opportunities/threats, regulatory constraints, potential users, IP (protection issues, prior art), business model, scientific relevance of the project, lab support, financial return, team aspects, realism of the announced plan, social & economic impact, risk management, potential applications, production issues
    6 | March 2011 Evaluation criteria
  • 7. Focus on first-stage evaluation criteria
    • Most important criteria for first-stage evaluation: originality of the innovation, profile of the inventor, scientific return/opportunities for the laboratory, business opportunity, project feasibility, market opportunities/threats, potential users, IP (protection issues, prior art), scientific relevance of the project, lab support, team aspects, realism of the announced plan, risk management, potential applications
    • Deemed premature for the evaluation of early-stage technology transfer projects (prior to product development): positive ROI/NPV calculations, venture value, regulatory constraints, business model, financial return, social & economic impact, production issues
    7 | March 2011 Evaluation criteria
  • 8. The DIGITEO example – Global positioning
    • The OMTE checklist is used for maturation projects
    8 | March 2011 Evaluation criteria
  • 9. Timing of the annual call for proposals – a long selection process
    → March: launch of the call for proposals / deadline for submissions
    → April: preselection of 10 projects
    → May: coaching by Digiteo’s marketing team
    → June/July: final application, oral presentation, deliberation, final decision
    → September: launch of proof-of-concept activities for the selected projects
    Process of Digiteo’s maturation programme
    9 | March 2011 Evaluation criteria
  • 10. From proposal to selection
    • ~10 proposals per year
    • Preselection/classification performed by Digiteo’s scientific committee and marketing staff
    • Coaching: work on the three components of the proposal (technology/marketing/IP) and submit the presentation for the final selection
    • Selection process:
       External experts (technology transfer specialists from industry clusters, incubators, the Paris region, the OSEO innovation fund, the chamber of commerce, etc.)
       Digiteo’s technology transfer committee
       Formal selection announced by Digiteo’s steering committee
    • An average of 5 projects selected per year
    10 | March 2011 Evaluation criteria
  • 11. Selection steps
    1. Scientific relevance – Scientific Committee
    2. Technology transfer potential (technical differentiation, value creation) – Technology Transfer Committee
    3. Recommendations – Expert Panel
    4. Final decision – DIGITEO’s Steering Committee
    11 | March 2011 Evaluation criteria
  • 12. Digiteo’s evaluation checklist 12 | March 2011 Evaluation criteria
  • 13. DIGITEO – Method/criteria
    • Evaluation criteria used for the OMTE call for projects:
       « Product/technology » aspects: originality/uniqueness and scientific relevance, project feasibility and opportunities created for the laboratory
       « Market » aspects: ongoing research contracts and IP related to the project, first applications and users considered
       « Team » aspects: support of the laboratories in the process, a project manager identified to manage the project, realism of the proposed planning and evaluation of the risks by the applicants
    • Method (see the sketch after this slide):
       Evaluation of the applications according to the 12 criteria
       Evaluators apply assessment scores from 1 to 3 (3 being the highest)
    13 | March 2011 Evaluation criteria
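To make the method more concrete, here is a minimal Python sketch of how such a scoring sheet could be tallied. The criterion identifiers, the grouping into three dictionaries and the simple averaging are illustrative assumptions, not the published OMTE tooling.

```python
# Minimal sketch of a Digiteo-style scoring sheet: 12 criteria in three groups,
# each scored 1-3 (3 = highest). The criterion identifiers and the simple
# averaging are illustrative assumptions, not part of the published OMTE method.

CRITERIA = {
    "product": ["originality", "scientific_relevance", "feasibility", "lab_opportunities"],
    "market": ["research_contracts", "intellectual_property", "potential_applications", "potential_users"],
    "team": ["lab_support", "project_manager", "realistic_planning", "risk_evaluation"],
}


def summarize(scores):
    """Return the average score (1-3) per group plus an overall average."""
    summary = {}
    for group, names in CRITERIA.items():
        values = [scores[name] for name in names]
        summary[group] = sum(values) / len(values)
    summary["overall"] = sum(scores.values()) / len(scores)
    return summary


if __name__ == "__main__":
    # Example: every criterion scored 2, except originality scored 3.
    example = {name: 2 for names in CRITERIA.values() for name in names}
    example["originality"] = 3
    print(summarize(example))
```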
  • 14. DIGITEO – « Product » criteria
    1. Originality of the innovation
    • Originality/uniqueness in comparison with the state of the art?
    • Definition of the future « product »?
    • Positioning compared to competitors?
    2. Scientific relevance of the project
    • Compatibility with the research themes covered by Digiteo?
    • Scientific excellence in the field?
    • Degree of scientific maturation (is the technology close to a « product »)?
    3. Project feasibility
    • Technical feasibility of the project?
    • Feasibility of the planning, with regard to a transfer?
    • Description of the transfer model envisaged (transfer to an industrial partner / creation of a start-up)?
    4. Scientific opportunities created for the laboratory
    • Consequences of the development on the scientific activities of the lab?
    • Future impact of the project on the lab’s strategy?
    • Impact on the external communications of the lab?
    14 | March 2011 Evaluation criteria
  • 15. DIGITEO – « Market » criteria
    5. Ongoing research contracts
    • Ongoing contracts with industrial partners?
    • Other contracts / scientific activities?
    • Since when? For how long?
    6. Intellectual property (patents, know-how)
    • Background knowledge of the teams involved?
    • Protection envisaged (foreground) for the new knowledge and the software deriving from it?
    • Is an IP analysis requested by the teams (analysis of the prior art, patent landscape and « freedom to operate »)?
    7. First potential applications
    • Types/examples of applications?
    • Value proposition (solution to which problem)?
    • Applications realised by which kind of company (software company, service provider)?
    8. First potential users
    • Existing and potential actors/partners to target for the transfer?
    • Example of an end-user for the integrated solution?
    • Draft definition of the targeted market (size, segmentation, competitors)?
    15 | March 2011 Evaluation criteria
  • 16. DIGITEO – « Team » criteria
    9. Support of the laboratories
    • Support of the laboratories involved?
    • Balance between the teams involved in the project (complementarity, synergy)?
    • Commitment to a real transfer?
    10. Project manager in charge
    • Profile of the project manager and involvement in the project?
    • Capacity to manage all aspects of the project, in keeping with the transfer objective?
    • Motivation to handle the three aspects: technical, IP, marketing?
    11. Realism of the planning
    • Realism of the planning with regard to the three aspects: technical, IP, marketing
    12. Evaluation/consideration of the risks
    • Identification and management of the risks: technical, IP, marketing
    16 | March 2011 Evaluation criteria
  • 17. DIGITEO – Assessment
    • A useful tool to be used as a checklist throughout the evaluation process
    • The final selection has to include the assessment of the presentation made in front of the jury; the grade given by the jury is based 50% on the written application and 50% on the oral presentation (see the sketch after this slide)
    • The jury should include a majority of external experts
    • Final selection: classification/ranking of the presented projects (top 5  selected)
    • Some « Digiteo specifics » are not to be considered for a generic checklist
    17 | March 2011 Evaluation criteria
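As an illustration of the 50/50 rule and the top-5 ranking, the following hedged Python sketch computes the jury grade and keeps the best five projects. The Application structure and its field names are assumptions made for this example, not part of Digiteo's tooling.

```python
# Hedged sketch of the final ranking rule: the jury grade weights the written
# application and the oral presentation 50/50, and the top five projects are
# selected. Class and field names are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Application:
    name: str
    written_score: float  # jury grade for the written application
    oral_score: float     # jury grade for the oral presentation

    @property
    def final_grade(self) -> float:
        return 0.5 * self.written_score + 0.5 * self.oral_score


def select_top(applications, top_n=5):
    """Rank applications by final grade and keep the best top_n."""
    ranked = sorted(applications, key=lambda a: a.final_grade, reverse=True)
    return ranked[:top_n]


if __name__ == "__main__":
    apps = [Application("A", 14, 16), Application("B", 12, 11), Application("C", 17, 13)]
    print([a.name for a in select_top(apps, top_n=2)])
```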
  • 18. Pros & Cons
    PROs
    • This practice attempts to formalize methods that are already in use (most of the time on an ad hoc basis)
    • The methodology and associated tools (call for proposal, criteria, etc.) are readily available and can be adapted to each individual case
    CONs
    • Only a selected number of criteria are highlighted
    • Some criteria may need to be further developed
    18 | March 2011 Evaluation criteria
  • 19. Rationale
    • Methodology developed by Digiteo in order to manage the incoming flow of technology transfer proposals
    • Need for a consistent set of criteria for all steps of the evaluation process, communicated transparently to all involved partners: project teams, internal Digiteo evaluators, “technology transfer coaches” and external experts
    • Without this methodology, the parties involved would get the impression that projects might be evaluated and selected for obscure reasons, leaving the door open to disputes, accusations of “unfair competition” and backstage lobbying
    19 | March 2011 Evaluation criteria
  • 20. Outcome
    • The approach turned out as expected
       The final selection (with external experts) is based on relative ranking among the presented projects
       The scoring system is only used for individual evaluation purposes
    • Coaching the teams before the final applications, supported by the evaluation checklist, proved to have a great impact on the oral presentations; final applications show significant improvements (in terms of technology transfer readiness) compared with the initial applications
    • Feedback indicates that the Digiteo community (researchers, external experts) judges the approach fair and clearly communicated
    20 | March 2011 Evaluation criteria
  • 21. Lessons learned
    • A strict, fixed procedure can sometimes be counter-productive. For this reason, scoring was not made compulsory for the final evaluation and is used only as a guideline by the selection committee.
    • Special attention must be paid to the management of unsuccessful project teams. A debriefing session is organised with each eliminated team in order to:
      • debrief the teams that were not selected
      • clearly communicate the reasons for not being selected
      • focus on things to be improved (and how to improve them)
      • encourage them to apply again with an enhanced proposal
    21 | March 2011 Evaluation criteria
  • 22. Plans for the future
    The approach should be further developed/detailed:
     Definition of terms
     Explanation of how to apply each of the listed criteria (with some examples)
    22 | March 2011 Evaluation criteria
  • 23. Evaluation Criteria in practice with IBBT
     IBBT wanted to use evaluation criteria for the following goals:
       scoring of new proposals at intake
       scoring & ranking of proposals within the same call
       a quick visual overview of the scoring across several proposals
       evaluation or follow-up of a Bootcamp project or incubation project
     The approach (see the sketch after this slide):
       a master list made up of all applicable evaluation criteria
       from this master list, a subset of criteria was defined for each type of project
       validation of those criteria sets by means of a proof of concept in Excel format on different kinds of projects
    23 | March 2011 Evaluation criteria
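A minimal Python sketch of the master-list/subset idea follows, using invented criterion and project-type names; IBBT's actual lists are not published in the deck.

```python
# Sketch of the IBBT-style approach: one master list of evaluation criteria and
# a subset of that list per project type. The criterion and project-type names
# are invented for illustration; the deck does not publish IBBT's actual lists.

MASTER_LIST = {
    "originality", "feasibility", "ip_position", "market_opportunity",
    "team_strength", "suitability_of_approach", "risk_management",
}

SUBSETS = {
    "intake": {"originality", "feasibility", "market_opportunity"},
    "bootcamp": {"originality", "team_strength", "suitability_of_approach"},
    "incubation": {"ip_position", "market_opportunity", "risk_management", "team_strength"},
}


def criteria_for(project_type):
    """Return the subset of the master list used for a given project type."""
    subset = SUBSETS[project_type]
    assert subset <= MASTER_LIST, "every subset must be drawn from the master list"
    return subset


if __name__ == "__main__":
    print(sorted(criteria_for("bootcamp")))
```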
  • 24. Lessons learned
     Using evaluation criteria to objectify and structure the assessment of a project based on different domains, criteria and sub-criteria proved to be very helpful:
      • in panel discussions: as a common base for discussion between the members of an expert panel selecting the most promising business ideas to go through to the coaching track of iBoot
      • in coaching trajectories: as a feedback tool indicating the team’s weak and strong points and providing constructive feedback from experts
      • in follow-up meetings: as a common base for discussion between the innovation board members about the progress of incubation projects
     Since assessments are made by different coaches, innovation board members and experts, we added some illustrative “quotes” to the 1–4 scoring; this makes the nuances between scores easier to justify (see the sketch after this slide).
    Example – Criterion: Suitability of the approach
      • Score 1: There is no connection whatsoever between the approach and the intended results
      • Score 2: The proposed approach is not very consistent with the intended innovation purpose
      • Score 3: The approach is consistent with the innovation purpose
      • Score 4: The approach is very well targeted
    24 | March 2011 Evaluation criteria
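One hedged way to encode such a rubric so that every score carries its illustrative quote is sketched below; the dictionary layout is an assumption, while the quotes for "Suitability of the approach" come from the example above.

```python
# Sketch of an IBBT-style rubric: each criterion maps a 1-4 score to an
# illustrative quote. The dictionary layout is an assumption; the quotes for
# "suitability_of_approach" are taken from the slide above.

RUBRIC = {
    "suitability_of_approach": {
        1: "There is no connection whatsoever between the approach and the intended results",
        2: "The proposed approach is not very consistent with the intended innovation purpose",
        3: "The approach is consistent with the innovation purpose",
        4: "The approach is very well targeted",
    },
}


def describe(criterion, score):
    """Return the quote matching a score, to justify it in written feedback."""
    return RUBRIC[criterion][score]


if __name__ == "__main__":
    print(describe("suitability_of_approach", 3))
```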
  • 25. Evaluation Criteria in practice with ICBS
     At ICBS, Digiteo’s evaluation criteria have been used for the evaluation of the final projects of the IE&D programme. The IE&D programme (Innovation, Entrepreneurship & Design) is part of an MBA course: students embark on a live entrepreneurial journey and investigate the commercial potential of their own idea or one being developed by Imperial Innovations or Innovations Royal College of Art.
     Modifications of the practice were needed before testing; new criteria were introduced:
      - Consideration of financials (economic feasibility)
      - Scalability (production & operation)
    25 | March 2011 Evaluation criteria
  • 26. Lessons learned
    • Qualitative approach to the indicators, making it easy to evaluate projects quickly.
    • Useful to inform the candidates, before the evaluation, of the criteria that will be used by the jurors.
    • A good tool to start a systematic evaluation of projects.
    • Suggestion to add a general evaluation of the opportunity as a preliminary step in the process: analyse whether there is a real problem (demand pull) or whether it is a technology-push type of opportunity.
    • Suggestion to include indicators on the economic feasibility of the project and on production/scalability. The goal is to see whether the predicted production process/time is reasonable. Scalability can be evaluated by taking into account the time and money invested in the prototype. As a consequence, some of the items may be combined in order to introduce these new concepts.
    • Re-writing some of the items would make them easier to understand without the guideline (the list of concepts).
    26 | March 2011 Evaluation criteria
  • 27. Evaluation Criteria in practice with CRP Henri Tudor
    The project definition process within Tudor has been reviewed. The goal was to organize the earliest phase of project development and include a screening phase where the project idea is assessed, including criteria on technology transfer.
    Difference of context: the criteria are applied to pre-projects, where the project idea is developed, formalized and validated.
    Objectives:
    • Create awareness of technology transfer in researchers’ minds when they write their project proposals
    • Improve the assessment of project ideas and have a reference framework for the entire organisation
    • Have better project follow-up to foster maturation projects and technology transfer
    27 | March 2011 Evaluation criteria
  • 28. Lessons learned
    • Putting a screening process in place created some fears within the researcher community, including the fear of stifling creativity. In the end, however, screening meetings have become a place of discussion where the project proposal can be assessed and enriched. The fact that these criteria are already used by peers is also reassuring for the organisation’s management and for researchers.
    • It is important to spend time on change management when implementing a new process that will have a strong impact on the organisation.
    • The outcome for Tudor is a better overview of the projects developed within the organisation for all stakeholders, and the integration of the transfer perspective at the very beginning of projects.
    28 | March 2011 Evaluation criteria
  • 29. Suggested Readings
     Link to code book: assessment; method; process; selection
     Link to related websites: OMTE call for proposals and projects selected during previous editions: http://www.digiteo.fr/Digiteo_OMTE
    29 | March 2011 Evaluation criteria