Assessing the commercialization potential of research-grounded technology projects is necessitated by the high failure rate, and resulting high cost, of technologies either before reaching the market or once on the market. Technology transfer offices (TTOs) therefore resort to preliminary assessments to get a first idea of a technology's commercial potential and, when resources are limited, to select the most promising ones. A set of criteria for performing such evaluations is provided here; a TTO can apply them either continuously or through periodic calls for proposals.
www.FITT-for-Innovation.eu
1. Evaluation Criteria
FITT
– Fostering Interregional Exchange in ICT Technology Transfer –
Except where otherwise noted, this work is licensed under a Creative Commons Attribution 3.0 License.
2. Criteria for evaluation of transfer projects
• This practice is designed to assist in the preliminary assessment of research-grounded technology projects for their commercialization potential in the realm of technology transfer.
• The process of assessing research projects is necessitated by the high failure rate, and resulting high cost, of technologies either prior to reaching the market or once in the market.
• The Evaluation Criteria are intended to provide guidance for assessing an idea, a technology or a research project at an early stage of technology transfer (thus prior to product development).
2 | March 2011 Evaluation criteria
3. The evaluation process
• Project evaluation may take place at various stages:
  Early stage (proof-of-concept "maturation" towards technology transfer)
  Pre-incubation
  Incubation
• Our focus is early-stage project evaluation, which may take place:
  In a continuous manner (or at regular intervals)
  Based on a CFP (call for proposals, typically once per year)
• Such early-stage evaluation covers:
  Evaluation criteria
  A process for the application of these criteria, including the structure/organization of the evaluation committee
The current practice focuses on recommended evaluation criteria.
5. Coverage/definition
• Evaluation criteria should cover three main aspects of a project:
  Technical aspects
  Market assessment
  Team considerations
• They should be defined and published in advance in order to allow the evaluated teams to adapt to the process
• They will be used to establish the overall process, the evaluation documents and the selection committee
• Evaluation criteria may be used by the evaluation committee to:
  Allocate funds/resources to selected projects
  Provide consultancy to the project team (for example, to coach the team on aspects considered "weak")
6. Evaluation criteria
Possible evaluation criteria
• Many possible evaluation methods/criteria are mentioned in the literature
• Several possible groups of criteria:
  Originality of the innovation
  Profile of the inventor
  Positive ROI/NPV estimations
  Scientific return/opportunities for the laboratory
  Business opportunity
  Venture value
  Project feasibility
  Market opportunities/threats
  Regulatory constraints
  Potential users
  IP (protection issues, prior art)
  Business model
  Scientific relevance of the project
  Lab support
  Financial return
  Team aspects
  Realism of the announced plan
  Social & economic impact
  Risk management
  Potential applications
  Production issues
7. Focus on first-stage evaluation criteria
Most important criteria for first-stage evaluation:
  Originality of the innovation
  Profile of the inventor
  Scientific return/opportunities for the laboratory
  Business opportunity
  Project feasibility
  Market opportunities/threats
  Potential users
  IP (protection issues, prior art)
  Scientific relevance of the project
  Lab support
  Team aspects
  Realism of the announced plan
  Risk management
  Potential applications
Deemed premature for the evaluation of early-stage technology transfer projects (prior to product development):
  Positive ROI/NPV calculations
  Venture value
  Regulatory constraints
  Business model
  Financial return
  Social & economic impact
  Production issues
8. The DIGITEO example - Global positioning
The OMTE checklist is used for maturation projects:
9. Timing of the annual call for proposals
Process of Digiteo's maturation programme (a long selection process):
→ March: launch of the call for proposals / deadline for submissions
→ April: preselection of 10 projects
→ May: coaching by Digiteo's marketing team
→ June/July: final application, oral presentation, deliberation, final decision
→ September: launch of proof-of-concept activities for the selected projects
10. From proposal to selection
• ~10 proposals per year
• Preselection classification performed by Digiteo's scientific committee and marketing staff
• Coaching: work on the three components of the proposal (technology/marketing/IP) and submit a presentation for the final selection
• Selection process:
  External experts (technology transfer specialists from: industry cluster, incubator, Paris region, OSEO innovation fund, chamber of commerce, etc.)
  Digiteo's technology transfer committee
  Formal selection announced by Digiteo's steering committee
• Average of 5 projects selected per year
11. Selection steps
1. Scientific relevance (technical differentiation) — Scientific Committee
2. TT potential (value creation) — Technology Transfer Committee
3. Recommendations — Expert Panel
4. Final decision — DIGITEO's Steering Committee
13. DIGITEO – Method/criteria
• Evaluation criteria used for the OMTE call for projects:
  « Product/technology » aspects
  Originality/uniqueness and scientific relevance, project feasibility and opportunities created for the laboratory.
  « Market » aspects
  Ongoing research contracts and IP related to the project, first applications and users considered.
  « Team » aspects
  Support of the laboratories in the process, a project manager identified to manage the project, realism of the proposed planning and evaluation of the risks by the applicants.
• Method:
  Evaluation of the applications according to the 12 criteria
  Evaluators apply assessment scores from 1 to 3 (3 being the highest)
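The scoring method above can be sketched in code. This is an illustrative sketch, not Digiteo's actual tool: the criterion identifiers follow the 12 criteria named on the next slides, and the per-aspect aggregation is an assumption.

```python
# Sketch of the OMTE scoring method: 12 criteria grouped into Product,
# Market and Team aspects, each scored from 1 to 3 (3 being the highest).
# The identifiers and the summing of scores per aspect are assumptions.

CRITERIA = {
    "product": ["originality", "scientific_relevance", "feasibility", "lab_opportunities"],
    "market": ["research_contracts", "intellectual_property", "applications", "users"],
    "team": ["lab_support", "project_manager", "planning_realism", "risk_evaluation"],
}

def score_application(scores: dict) -> dict:
    """Sum the 1-3 scores per aspect (max 12 each) and overall (max 36)."""
    for name, value in scores.items():
        if not 1 <= value <= 3:
            raise ValueError(f"{name}: scores must be between 1 and 3")
    totals = {aspect: sum(scores[c] for c in names)
              for aspect, names in CRITERIA.items()}
    totals["overall"] = sum(totals.values())
    return totals
```

For example, an application scored 2 on every criterion would total 8 per aspect and 24 overall.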
14. DIGITEO – « Product » criteria
1. Originality of the innovation
• Originality/uniqueness in comparison with the state of the art?
• Definition of the future « product »?
• Positioning compared to competitors?
2. Scientific relevance of the project
• Compatibility with the research themes covered by Digiteo?
• Scientific excellence in the field?
• Degree of scientific maturation (is the technology close to a « product »)?
3. Project feasibility
• Technical feasibility of the project?
• Feasibility of the planning, with regard to a transfer?
• Description of the transfer model envisaged (transfer to an industrial partner / creation of a start-up)?
4. Scientific opportunities created for the laboratory
• Consequences of the development on the scientific activities of the lab?
• Future impact of the project on the lab's strategy?
• Impact on the external communications of the lab?
15. DIGITEO – « Market » criteria
5. Ongoing research contracts
• Ongoing contracts with industrial partners?
• Other contracts / scientific activities?
• Since when? For how long?
6. Intellectual property (patents, know-how)
• Background knowledge of the teams involved?
• Protection envisaged (foreground) for the new knowledge and the software deriving from it?
• Is an IP analysis requested by the teams (analysis of the prior art, patent landscape and « freedom to operate »)?
7. First potential applications
• Types/examples of applications?
• Value proposition (solution to which problem)?
• Applications realised by which kind of company (software company, service provider)?
8. First potential users
• Existing and potential actors/partners to target for the transfer?
• Example of an end-user for the integrated solution?
• Draft definition of the targeted market (size, segmentation, competitors)?
16. DIGITEO – « Team » criteria
9. Support of the laboratories
• Support of the laboratories involved?
• Balance between the teams involved in the project (complementarity, synergy)?
• Commitment to a real transfer?
10. Project manager in charge
• Profile of the project manager and involvement in the project?
• Capacity to manage all aspects of the project, in keeping with the transfer objective?
• Motivation to handle the 3 aspects: technical, IP, marketing?
11. Realism of the planning
Realism of the planning with regard to the 3 aspects:
• Technical
• IP
• Marketing
12. Evaluation/consideration of the risks
Identification and management of the risks:
• Technical
• IP
• Marketing
17. DIGITEO - Assessment
• A useful tool to be used as a checklist throughout the evaluation process
• The final selection has to include the assessment of the presentation made in front of the jury. The grade given by the jury is based 50% on the written application and 50% on the oral presentation.
• The jury should include a majority of external experts
• Final selection: classification/ranking of the presented projects (top 5 selected)
• Some « Digiteo specifics » are not to be considered for a generic checklist
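The final-selection rule above (50% written application, 50% oral presentation, top-5 ranking) can be sketched as follows. This is an illustrative sketch under the stated 50/50 weighting, not Digiteo's actual selection tool; the project names are hypothetical.

```python
# Sketch of the final-selection grading: the jury grade is 50% written
# application and 50% oral presentation, and the presented projects are
# ranked, with the top five selected. Weights follow the slides; the
# function and project names are illustrative assumptions.

def final_ranking(projects: dict, top_n: int = 5) -> list:
    """projects maps name -> (written_grade, oral_grade); returns selected names."""
    combined = {name: 0.5 * written + 0.5 * oral
                for name, (written, oral) in projects.items()}
    # Rank by combined grade, highest first, and keep the top_n projects.
    ranked = sorted(combined, key=combined.get, reverse=True)
    return ranked[:top_n]
```

For instance, with grades out of 20 for three hypothetical projects, `final_ranking({"A": (16, 18), "B": (10, 9), "C": (14, 15)}, top_n=2)` returns `["A", "C"]`.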
18. Pros & Cons
PROs
• This practice attempts to formalize methods that are already in use (most of the time on an ad hoc basis)
• The methodology and associated tools (call for proposals, criteria, etc.) are readily available and can be adapted to each individual case
CONs
• Only a selected number of criteria are highlighted
• Some criteria may need to be further developed
19. Rationale
• Methodology developed by Digiteo in order to manage the incoming flow of technology transfer proposals
• Need for a consistent set of criteria for all steps of the evaluation process, communicated transparently to all involved partners: project teams, internal Digiteo evaluators, "technology transfer coaches" and external experts
• Without this methodology, the involved parties would get the impression that projects might be evaluated and selected for obscure reasons. This would leave the door open for debate, accusations of "unfair competition" and backstage lobbying
20. Outcome
• The approach worked out as expected
  The final selection (with external experts) is based on a relative ranking among the presented projects
  The scoring system is only used for individual evaluation purposes
• Coaching the teams before the final applications, supported by the evaluation checklist, proved to have a great impact on the oral presentations. Final applications show significant improvements (in terms of technology transfer readiness) compared with the initial applications.
• Feedback indicates that the Digiteo community (researchers, external experts) judges the approach fair and clearly communicated
21. Lessons learned
• A strict, fixed procedure can sometimes be counter-productive. For this reason, scoring was not made compulsory for the final evaluation and is instead used as a guideline by the selection committee.
• Special attention must be paid to the management of unsuccessful project teams. A debriefing session is organised with each eliminated team in order to:
  • clearly communicate the reasons for not being selected
  • focus on things to be improved (and how to improve them)
  • encourage them to apply again with an enhanced proposal
22. Plans for the future
The approach should be further developed/detailed:
  Definition of terms
  Explanation of how to apply each of the listed criteria (with some examples)
23. Evaluation Criteria in practice with IBBT
IBBT wanted to use evaluation criteria for the following goals:
  scoring of new proposals at intake
  scoring & ranking of proposals within the same call
  a quick visual overview of the scoring across several proposals
  evaluation or follow-up of a Bootcamp project or incubation project
The approach:
  a master list made up of all applicable evaluation criteria
  from this master list, a subset of criteria defined for each type of project
  validation of those criteria sets by means of a proof of concept in Excel format on different kinds of projects
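The master-list approach above can be sketched as follows. This is an illustrative sketch: the slides do not list IBBT's actual criteria or project-type names, so the entries below are hypothetical placeholders for the idea of deriving per-project-type subsets from one master list.

```python
# Sketch of IBBT's approach: one master list of all applicable evaluation
# criteria, from which a subset is derived per project type (intake
# scoring, Bootcamp, incubation follow-up). All names are assumptions.

MASTER_LIST = {
    "originality": ["intake", "bootcamp", "incubation"],
    "market_opportunity": ["intake", "incubation"],
    "team_commitment": ["bootcamp", "incubation"],
    "financial_return": ["incubation"],
}

def criteria_for(project_type: str) -> list:
    """Return the subset of master-list criteria applicable to a project type."""
    return [c for c, types in MASTER_LIST.items() if project_type in types]
```

Keeping a single master list means a criterion is defined once and reused consistently across every call and follow-up, which is what makes scores comparable between proposals.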
24. Lessons learned
Using evaluation criteria to objectify and structure the assessment of a project based on different domains, criteria and sub-criteria proved to be very helpful:
• in panel discussions: as a common base for discussion between the members of an expert panel to select the most promising business ideas to go through to the coaching track of iBoot
• in coaching trajectories: as a feedback tool for the team, indicating weak and strong points and getting constructive feedback from experts
• in follow-up meetings: as a common base for discussion between the innovation board members about the progress of incubation projects
Since assessments are made by different coaches, innovation board members and experts, we added some illustrative "quotes" to the 1-4 scoring. This makes the nuances between scoring numbers more justifiable.
Example – Criterion: Suitability of the approach
• Score 1: There is no connection whatsoever between the approach and the intended results
• Score 2: The proposed approach is not very consistent with the intended innovation purpose
• Score 3: The approach is consistent with the innovation purpose
• Score 4: The approach is very well targeted
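The quoted scale above can be sketched in code. The quotes are taken from the example on this slide; the function and report format are illustrative assumptions.

```python
# Sketch of IBBT's quoted 1-4 scoring for the "suitability of the
# approach" criterion: each score carries an illustrative quote so that
# different assessors interpret the numbers consistently.

SUITABILITY_QUOTES = {
    1: "There is no connection whatsoever between the approach and the intended results",
    2: "The proposed approach is not very consistent with the intended innovation purpose",
    3: "The approach is consistent with the innovation purpose",
    4: "The approach is very well targeted",
}

def annotated_score(score: int) -> str:
    """Pair a 1-4 score with its illustrative quote for the assessment report."""
    if score not in SUITABILITY_QUOTES:
        raise ValueError("score must be between 1 and 4")
    return f"{score}/4: {SUITABILITY_QUOTES[score]}"
```

Attaching a quote to each number is what makes scores from different coaches and board members comparable: a "3" always means the same observable thing.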
25. Evaluation Criteria in practice with ICBS
At ICBS, the evaluation criteria of Digiteo have been used for the evaluation of the final projects of the IE&D programme.
The IE&D programme (Innovation, Entrepreneurship & Design) is part of an MBA course. Students embark on a live entrepreneurial journey and investigate the commercial potential of their own idea or one being developed by Imperial Innovations or Innovations Royal College of Art.
Modification of the practice was needed before testing; new criteria were introduced:
- Considerations of financials (economic feasibility)
- Scalability (production & operation)
26. Lessons learned
• The qualitative approach of the indicators makes it easy to quickly evaluate the projects.
• It is useful to inform the candidates, before the evaluation, of the criteria that will be used by the jurors.
• A good tool to start a systematic evaluation of projects.
• Suggestion to add a general evaluation of the opportunity as a preliminary step in the process: analyse whether there is a real problem (demand pull) or whether it is a technology-push type of opportunity.
• Suggestion to include indicators on the economic feasibility of the project and on production/scalability. The goal is to see whether the predicted production process/time is reasonable. Scalability can be evaluated by taking into account the time and money invested in the prototype. As a consequence, some of the items may be combined in order to introduce these new concepts.
• Rewriting some of the items would make them easier to understand without the guideline (the list of concepts).
27. Evaluation Criteria in practice with CRP Henri Tudor
The project definition process within Tudor has been reviewed. The goal was to organize the earliest phase of project development and include a screening phase where the project idea is assessed, including criteria on technology transfer.
Difference of context: criteria for pre-projects, where the project idea is developed, formalized and validated.
Objective
• Create awareness of technology transfer in researchers' minds when they write their project proposals
• Improve the assessment of project ideas and have a reference framework for the entire organisation
• Ensure better project follow-up to foster maturation projects and technology transfer
28. Lessons learned
• Putting in place a screening process created some fears within the researcher community, including the fear of stifling creativity. In the end, however, the screening meetings have become a place of discussion where the project proposal can be assessed and enriched. The fact that these criteria are already used by peers is also a guarantee for the organisation's management and researchers.
• It is important to spend time on change management when implementing a new process that will have a strong impact on the organisation.
• The outcome for Tudor is a better overview of the projects developed within the organisation for all stakeholders, and the integration of the transfer perspective at the very beginning of projects.
29. Suggested Readings
Link to code book
assessment; method; process; selection
Link to related websites
OMTE call for proposals and projects selected during previous editions:
http://www.digiteo.fr/Digiteo_OMTE