Why bids fail: Bidding for EU ICT research projects



This seminar gave an insider’s view on bidding for EU research funds. It focused on EU FP7 ICT research instruments (IPs, STREPs, etc.): what they are, how they are evaluated, why bids fail, and what a successful bid looks like.


  1. Why bids fail: Bidding for EU ICT research projects. Stephen Brown, 20 May 2010
  2. Overview
     • FP7 ICT call 1 results
     • Project types
     • The evaluation process
     • Evaluation criteria
     • Scoring
     • Good proposals
     • Bad proposals
  3. Most proposals fail
  4. FP7 ICT call 1 results
     • 188 eligible proposals submitted
     • 12 proposals funded (52 M€)
     • 6% success rate
     • Most proposals fail…
     • …but not because they are bad
  5. FP7 ICT call 1 results: most proposals have to fail
     Project type | Funding requested | Funding published
     IPs          | 181.5 M€          | 20-32.5 M€
     STREPs       | 506.8 M€          | 10-22 M€
     NoEs         | 15.6 M€           | 5 M€
     CSAs         | 14.2 M€           | 2.5 M€
  6. Project types
     • STREPs (Specific Targeted Research Projects)
     • IPs (Integrated Projects)
     • NoEs (Networks of Excellence)
     • CAs (Coordination Actions)
     • SAs (Support Actions)
  7. STREPs
     • Focused objectives
     • Clearly identified problem
     • Research and demonstration activities
     • Scope for competing approaches to solving problems
     • Small scale (2-4 M€ over 1-2 years)
  8. IPs
     • Integration of projects within a coherent set of activities
     • Outreach and validation are important
     • Include training, innovation, take-up and dissemination activities
     • Active partners with substantial roles and clear responsibilities
     • Large IPs (average range 6-9 M€ over 3-4 years) need to show very clearly how they will make a significant impact in their target area
  9. NoEs
     • Aim is the durable integration of high-calibre research capacity
     • NoEs should involve the stakeholders, especially industry
     • Funding is for convergence and embedding, not research
     • Size and funding of NoEs much smaller than in FP6 (2-3 M€ over 3 years)
  10. CSAs
     • Bring researchers together in new areas (as forerunners of eventual NoEs)
     • Support workshops and communities of practice, e.g. in creating framework conditions for take-up of research work
     • Support building and maintaining the body of evidence of research
     • <1 M€ each
  11. The evaluation process
  12. The evaluation process
     STREPs:
     • Individual expert review
     • Consensus group
     • Panel meeting
     IPs & NoEs:
     • Individual expert review
     • Consensus group
     • Interim panel meeting
     • Hearing
     • Panel meeting
     Plenty of opportunities to fail.
  13. Evaluation criteria
  14. S/T QUALITY: “Scientific and/or technological excellence (relevant to the topics addressed by the call)”
     • Soundness of concept, and quality of objectives
     • Progress beyond the state of the art
     • Quality and effectiveness of the S/T methodology and associated work plan
     IMPLEMENTATION: “Quality and efficiency of the implementation and the management”
     • Appropriateness of the management structure and procedures
     • Quality of the consortium as a whole (including complementarity, balance)
     • Quality and relevant experience of the individual participants
     • Appropriateness of the allocation and justification of the resources to be committed (budget, staff, equipment)
     IMPACT: “Potential impact through the development, dissemination and use of project results”
     • Contribution, at the European and/or international level, to the expected impacts listed in the work programme under the relevant topic/activity
     • Appropriateness of measures for the dissemination and/or exploitation of project results, and management of intellectual property
  15. Scoring
     • 0 - The proposal fails to address the criterion under examination or cannot be judged due to missing or incomplete information.
     • 1 - Very poor. The criterion is addressed in a cursory and unsatisfactory manner.
     • 2 - Poor. Serious inherent weaknesses in relation to the criterion in question.
     • 3 - Fair. While the proposal broadly addresses the criterion, there are significant weaknesses that would need correcting.
     • 4 - Good. The proposal addresses the criterion well, although certain improvements are possible.
     • 5 - Excellent. The proposal successfully addresses all relevant aspects of the criterion in question. Any shortcomings are minor.
  16. Funding thresholds
     • Proposals must score at least 3 on every criterion to be funded
     • Proposals scoring 10 or above (out of a maximum of 15 across the three criteria) can be considered for funding
     • 35% of proposals scored above the threshold but were not funded
     • Proposals scoring 13.5 or above are usually considered for funding
     • You need to drop only three half marks (from 15 to below 13.5) to fail
  17. FP7 ICT call 1 results
     • 67% of proposals failed on multiple thresholds
     • Of these, 43% failed on criterion 1 - scientific excellence relevant to the objectives
     • 52% of IPs and 40% of STREPs failed on criterion 3 - impact
  18. Don’t give up
  19. What good proposals look like
     • Describe your problem, explain why it is relevant and how you will tackle it
     • Describe the specific state of the art with referenced evidence, as well as the technical baseline and the expected advances against which progress can be measured
     • Show you understand the state of the art; don’t just list projects and articles
     • Check the timelines and anticipated outputs of ongoing research when defining your starting point and the advances you will make; don’t replicate existing work
  20. What good proposals look like
     • Explain how you will ensure impact
     • Adopt a scientifically sound approach to involving users in the research, including the assessment and validation needed to build evidence of impact
     • Find the right partners, not necessarily the nearest or most convenient. Do justice to the multi-disciplinary nature of the area: ensure the expertise and the roles are balanced and appropriate
     • Cost out work packages clearly and realistically
  21. Relevance
  22. Work Programme
     • Target outcomes
     • Fifth call results
  23. Common mistakes
  24.
     • Providing yet another training solution for a particular set of users (e.g. training for engineers) with no new work on how people acquire skills and competences in different contexts
     • Developing a Learning Management System, Content Delivery Platform or VLE: these are mainstream eLearning products
     • Developing something for a specific language, geography or history without justifying how ICTs will improve learning in that field
  25.
     • Describing a “technology driven” type of project
     • Failing to balance the research across the contributing disciplines
     • Failing to identify what the different disciplines contribute
     • Producing a proposal that tries to do everything and is just not credible; often less is more
  26. Critical questions you should ask yourselves
     • Does the proposal address the right published target outcomes?
     • Does the proposal address a new problem, or offer different and innovative insights into an existing one?
     • How far is the problem you intend to address already being tackled elsewhere?
     • Which communities are likely to benefit from the project, and how are they involved?
     • What will the benefits and impact of the project be?
     • What are the challenges and potential risks, and how are they tackled?
  27. Further information
     • Comprehensive EC guide to “How to fail”:
       ftp://ftp.cordis.europa.eu/pub/ist/docs/telearn/what-not-to-do_en.pdf
     • Introduction to Cultural Heritage & Technology Enhanced Learning, including links to programme descriptions, publications describing currently funded projects, and links to Commission and project web sites and contacts:
       http://cordis.europa.eu/fp7/ict/telearn-digicult/home_en.html
  28. For further information, contact Stephen Brown