URBACT 2: Methodologies and lessons from URBACT 1
Presentation by Paul Soto for URBACT 2 kickoff meeting April 2008

Transcript

  • 1. From methodologies to outputs Lessons from URBACT 1
  • 2. URBACT 1
    • Wide range of methodologies tested
    • Some networks focused heavily on learning methodologies – UDIEX/ALEP
    • Urbact Methods and Tools (document library):
      • http://urbact.eu/document-library/urbact-methods-and-tools.html
    • 3 phases: getting prepared – exchanges – dissemination.
    • 2 basic building blocks – thematic seminars + case studies
  • 3. Urbact Methods and Tools
    • Getting prepared
    • Common understanding of needs and aims
    • Selection of relevant practices
    • Selection of relevant people
    • Exchanges
    • Seminars + field trips
    • Study trips
    • Coaching – mentoring
    • Intermediate outputs
    • Dissemination
    • Thematic reports
    • Case studies and CS reports
    • Executive summaries + brochures
    • Research reports
    • Toolkits
    • Websites
    • Dissemination events
  • 4. Areas for improvement
    • Critical points
    • Need to clarify aims and target public
    • Dominance of the visibility motive
    • Products for learning confused with dissemination
    • Little account of stakeholders
    • Policy relevance (politicians)
    • Huge variation in quality – lack of common standards, language and style
    • Less emphasis on preparation and follow-up
    • Isolated outputs. No critical mass
    • Policy relevance?
    • Static
    • Action
    • Adapt methodologies and products to aim and target public
    • Visibility +
    • Clear separation of intermediate and final
    • Include stakeholder dimensions – 360° approaches
    • Common core elements, benchmarks, models for certain key methods/outputs
    • Allow time for preparation and follow up
    • Specifying expected results and monitoring
    • More synergy between products and activities – itineraries.
    • More flexible and agile use of IT and multimedia
  • 5. Questions
    • Where do you think we can improve and how?
    • How can we improve learning between us?
    • How can we increase quality, achieve certain common + comparable standards and also allow for innovation?