Smart cities benchmarking egov and codesign
Brief overview of issues and challenges relating to benchmarking e-government generally and co-design specifically

  • This diagram sets out SCRAN's initial representation of the triple-helix partnership for Smart Cities. As can be seen, the three dimensions of SCRAN's triple-helix DNA lie in the intellectual capital, learning and knowledge of Smart Cities. Set out as a matrix, intellectual capital, learning and knowledge make up the rows and are attached to the work packages which form the substantive components of Smart Cities. These in turn relate to the partners responsible for developing the intellectual capital, learning communities and knowledge base in question (the universities, cities and regions respectively).
    1. Some benchmarking issues<br />
    2. Benchmarking e-Gov<br />Source: Deakin, M. (2010) "SCRAN's Development of a Trans-National Comparator for the Standardisation of eGovernment Services", in Reddick, C. (ed.), Comparative E-government: An Examination of E-Government Across Countries, Springer (Integrated Series in Information Systems)<br />2<br />
    3. Triple-helix model<br />[Diagram: SCRAN's triple-helix of knowledge-based learning and generation of intellectual capital. Rows: intellectual capital (university), learning (city) and knowledge (region), each attached to work packages including co-design, networking, multi-channelling, user-profiling, capacity building, enterprise architecture, business modelling, customisation, and monitoring &amp; evaluation. Edinburgh Conference: towards Smarter Cities, March 2009]<br />
    4. Benchmarking of eGovernment services<br /><ul><li>Using the typology of administrative systems put forward by Torres et al (2005), it is evident that the democracies of the North Sea roughly approximate to the Nordic (Norse, Danish, Swedish and Finnish) nation-states and are a mix of Anglo-American (UK) and European Continental administrations (those of the Netherlands, Germany and Belgium)
    5. As such they are said to be: consumer-centred, client-orientated, citizen-based, consultative and increasingly deliberative in their search for efficiency and effectiveness from the development of eGov services
    6. Torres et al (2005) have gone on to use these characteristics as a means to review the e-readiness of each European member-state and to assess levels of provision in terms of both the depth and breadth of the services available on city websites. The outcomes of this exercise have in turn been used to construct a "maturity index" of such developments</li></ul>4<br />
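The deck does not reproduce the formula behind Torres et al's maturity index, but the idea of combining breadth (share of benchmark services offered online) with depth (stage of development reached) can be sketched as follows. The function name, stage weights, and scoring scheme are illustrative assumptions, not the published methodology.

```python
# Hypothetical sketch of a service "maturity index" combining breadth
# (fraction of benchmark services offered online) with depth (stage of
# development each offered service has reached). Stage weights are
# assumptions for illustration, not Torres et al's (2005) method.

STAGE_WEIGHT = {"informational": 1, "interactive": 2, "transactional": 3}

def maturity_index(services):
    """services: list of (offered: bool, stage: str) per benchmark service.

    Returns a 0-100 score: breadth (share offered) scaled by how deep
    the offered services go relative to the maximum possible depth.
    """
    if not services:
        return 0.0
    breadth = sum(1 for offered, _ in services if offered) / len(services)
    depth = sum(STAGE_WEIGHT[stage] for offered, stage in services if offered)
    max_depth = len(services) * max(STAGE_WEIGHT.values())
    return round(100 * breadth * (depth / max_depth), 1)

# A city offering 3 of 4 services at mixed stages:
city = [(True, "transactional"), (True, "informational"),
        (False, "informational"), (True, "interactive")]
print(maturity_index(city))  # 37.5
```

A weighted product like this rewards cities for both offering many services and pushing them toward the transactional stage; a real index would also need agreed service baskets and stage definitions.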
    7. Benchmarking of eGovernment services<br />Using this index of eGov service development, the exercise uncovers three "city groupings". These are the:<br /><ul><li>Innovative group: with a strong position in delivering services online (up to 60% of the total) and a good situation with respect to the stages of development, i.e. informational, interactive and transactional.
    8. Steady achiever: offering great potential for the development of the Internet, but with a limited range of online services (between 30% and 45%).
    9. Platform builders: web sites offering the lowest level of services online and benefits to citizens (less than 30%, with little more than the power to offer information).</li></ul>Within this classification of city websites, all those within the North Sea fall into the "steady achiever" category, with a modest online presence at either the informational, interactive or transactional level of provision.<br />5<br />
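The three groupings above are keyed on the share of services delivered online. A minimal sketch of that classification follows; the slide text leaves a gap between 45% and 60%, so the exact cut-off for the innovative group is an assumption here.

```python
# Sketch of the three-way city grouping described above, keyed on the
# percentage of total services delivered online. The 45% boundary for
# the innovative group is an assumption (the slide text jumps from
# "between 30-45%" to "up to 60%").

def city_grouping(pct_online):
    """pct_online: percentage of total services delivered online (0-100)."""
    if pct_online >= 45:
        return "innovative"        # strong online delivery, up to ~60%
    if pct_online >= 30:
        return "steady achiever"   # limited range of online services
    return "platform builder"      # little more than information online

print(city_grouping(55))  # innovative
print(city_grouping(35))  # steady achiever
print(city_grouping(10))  # platform builder
```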
    10. EU i2010 Benchmarking report<br />6<br />the North Sea now has an average score of .....<br />
    11. EU i2010 Benchmarking report<br />7<br />Source: EC (2009) Smarter, Faster, Better Government<br />
    12. Benchmarking co-design<br />8<br />
    13. 9<br />
    14. Co-design: Transforming the citizen<br />10<br />[Diagram: quadrant mapping citizens along two axes, passive citizen to active citizen and ICT novice to ICT expert]<br />"The value of Community Informatics to participatory urban planning and design: a case-study in Helsinki", Joanna Saad-Sulonen and Liisa Horelli, 2010, Journal of Community Informatics, http://www.ci-journal.net/index.php/ciej/article/view/579/603<br />
    15. The context<br />11<br />
    16. Where does design fit in?<br />12<br />The planning cycle<br />"…a locus for learning and capacity building for the engaged stakeholders"<br />Joanna Saad-Sulonen and Liisa Horelli, 2010<br />
    17. Why benchmark?<br />Questions<br /><ul><li>What are our peers doing, and how are we placed in relation to them?
    18. What is acceptable good practice, and how are we placed with regard to these practices?
    19. Based upon these comparisons, can we be said to be doing enough?
    20. How do we identify what is required to reach an adequate level of activity?</li></ul>13<br />
    21. Organisational capability for co-design<br />14<br />
    22. Workshop II<br />15<br />
